Two's bloody complement?!?!?

1
Well, here I am, writing up the notes for a test in computer systems and programming tomorrow, and I'm looking through the stuff we have to learn. We've only been on the course for a few weeks, so I wasn't expecting miracles, and at the moment the only thing I've encountered that was scary was database normalisation.

Anyway, I was writing up about bitmaps (back in my day they were UDGs and proud of it, btw) and I noticed that one of the things you have to learn is two's complement in binary.

What is the obsession with teaching people this? It's not hard, I'll admit, but it is totally irrelevant in modern programming; it's something that happens at CPU level, and even if you code in assembler, you aren't going to encounter it.

It's like all the university 'Old School' profs get together to reminisce about those days when programming didn't just involve telling the computer what to do, but telling it how to do it as well. There's no need for it. Binary is useful and all, I can even understand teaching Boolean logic, but I wish they'd let 2's comp rest in peace.
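For anyone who's forgotten what the fuss is even about, the whole trick fits in a few lines; a quick Python sketch (function name and width my own, not from the coursework):

```python
def twos_complement(value, bits=8):
    """Return the two's-complement bit pattern of a signed integer."""
    if not -(1 << (bits - 1)) <= value < (1 << (bits - 1)):
        raise ValueError(f"{value} does not fit in {bits} bits")
    # Masking to the low `bits` bits is, for a negative number, exactly
    # the textbook recipe: invert all the bits, then add one.
    return format(value & ((1 << bits) - 1), f"0{bits}b")

print(twos_complement(5))    # 00000101
print(twos_complement(-5))   # 11111011  (invert 00000101 -> 11111010, add 1)
```

The top bit doubling as the sign is the whole point: addition hardware doesn't need to care whether the operands are signed.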

Rant over :)

3
Indeeds.

We had to do a semester of binary and the "little man computer" or whatever it was for networks. Now that, to me, seems irrelevant to someone studying web applications... =/

I guess it's better knowing than not knowing in some cases though. Just wish they didn't spend so much bloody time on it. Waste of half a year and my money.
Grug
Returned Loveable SectorGame Addict

The Apocalypse Project | Machina Terra | Lost Souls | Starfox: Shadows of Lylat | Stargate SG1: Earth's Defense

Re: Two's bloody complement?!?!?

4
Flipside wrote:Well, here I am, writing up the notes for a test in computer systems and programming tomorrow, and I'm looking through the stuff we have to learn. We've only been on the course for a few weeks, so I wasn't expecting miracles, and at the moment the only thing I've encountered that was scary was database normalisation.

Anyway, I was writing up about bitmaps (back in my day they were UDGs and proud of it, btw) and I noticed that one of the things you have to learn is two's complement in binary.

What is the obsession with teaching people this? It's not hard, I'll admit, but it is totally irrelevant in modern programming; it's something that happens at CPU level, and even if you code in assembler, you aren't going to encounter it.

It's like all the university 'Old School' profs get together to reminisce about those days when programming didn't just involve telling the computer what to do, but telling it how to do it as well. There's no need for it. Binary is useful and all, I can even understand teaching Boolean logic, but I wish they'd let 2's comp rest in peace.

Rant over :)
I have to admit never using it in my work life, but it is a necessary touchstone for doing even basic classes in computer architecture (such as MIPS).

6
You did MIPS? I remember my CAD (Computer Architecture and Design) lecturer being rather vocal(ly damning) that only Strathclyde and Glasgow (and maybe Edinburgh) actually did the subject in depth for 'plain' CS degrees.

8
Computer Organization and Design (Patterson & Hennessy) is a good place to start. And finish, come to think of it - it's a huge, rather mind-numbing but very comprehensive book.

CAD was a hard class - so hard they moved it from 2nd to 3rd year. That said, I got 95% and destroyed the exam (my lecturer was literally asking me to stop writing so much about two thirds of the way through, because I'd filled 2 exam scripts), so I must have gone right somewhere :D. Just don't ask me how.....

9
aldo wrote:Computer Organization and Design (Patterson & Hennessy) is a good place to start. And finish, come to think of it - it's a huge, rather mind-numbing but very comprehensive book.

CAD was a hard class - so hard they moved it from 2nd to 3rd year. That said, I got 95% and destroyed the exam (my lecturer was literally asking me to stop writing so much about two thirds of the way through, because I'd filled 2 exam scripts), so I must have gone right somewhere :D. Just don't ask me how.....
Smart ass. =/

You got a good memory or something?

10
Grug wrote:
aldo wrote:Computer Organization and Design (Patterson & Hennessy) is a good place to start. And finish, come to think of it - it's a huge, rather mind-numbing but very comprehensive book.

CAD was a hard class - so hard they moved it from 2nd to 3rd year. That said, I got 95% and destroyed the exam (my lecturer was literally asking me to stop writing so much about two thirds of the way through, because I'd filled 2 exam scripts), so I must have gone right somewhere :D. Just don't ask me how.....
Smart ass. =/

You got a good memory or something?
Did lots of work on cache interleaving while researching the Gamecube for the project. It's largely pot luck what you get in the exam, most of the time. Still, you can't blame me for being well chuffed at getting my 2nd highest uni score on supposedly one of the hardest classes in the course, if not the whole department.

Can I remember it all now? Can I f###! :D

Re: Two's bloody complement?!?!?

11
aldo wrote:
I have to admit never using it in my work life, but it is a necessary touchstone for doing even basic classes in computer architecture (such as MIPS).
True, if someone is designing board architecture and specifically needs to hardwire the maths through NOT gates etc., then I'll agree it's essential, but when you are working at compiler level and are highly unlikely to go below it, it's pretty pointless. The thing is, nowadays it's so rare to need a bespoke maths setup; if you did, though, 2's comp would still be useful.

Personally, I just think the time on a programming course could be better spent drilling Boolean into the class, especially the OR/XOR difference, which always makes me laugh when it catches people on the hop ;) Not only is it useful in a practical sense, but it really helps you understand, in my opinion, the way a computer 'thinks'. I actually found learning about logic gates made thinking about flowcharts etc. a lot easier :)