As the old joke goes, there are 10 types of people in the world: those who understand Binary and those who don't. There seem to be fewer of those Binary-understanding people in the world today. Or do I just travel in the wrong circles? Does computer science really require a knowledge of different number systems anymore?
I learned different number bases in around 5th or 6th grade. I thought they were a blast. I played around counting in things like base 5 and 7 for weeks afterwards. The pattern and structure of numbers just made so much sense to me after that and it was fun seeing how it worked in different variations.
Then when I started taking computer science classes, where Binary and Octal (Octal groups the bits very nicely on computers with 12-bit words) were part of the daily vocabulary, I felt right at home. Later, using Hexadecimal was also pretty easy, though I admit to being geek enough that I had a calculator that handled Decimal, Hex, Octal and Binary, and I used it quite a bit.
Back in the day I used Octal and Hex arithmetic to work my way through stack dumps, crash dumps, and to make patches in code of all sorts. Binary was the number system of choice for bit flags, and setting and checking bits was something I did every day - especially when I was doing OS development those many years ago. I was as likely to want to know an ASCII code in Octal as in Decimal or Hex when looking at data dumps. Today I still enjoy my binary coded decimal clock, in spite of the fact (or perhaps because of the fact) that it drives my wife crazy when I point to it and ask her what time it is. There was a time when she could have told me faster than I could, but she's a little out of practice.
These days it seems as though Binary, Hex and especially Octal have fallen into disuse. Memory is cheap, so people feel OK using a whole word to serve as a Binary flag. Sure, there are people programming in C/C++ who occasionally look at raw data, but how many recognize that hex "20" means a space? Or maybe I'm wrong and people are teaching it. But are they teaching it as fun?
I've never been a math geek. Sad but true. Still, learning binary and octal and hex (and more) gave me an appreciation for how numbers really work. Just as learning different natural languages helps people understand how their own works, I found that learning these other systems gave me a deeper understanding of Decimal. It was a great thing to learn at an early age. It's a shame that it doesn't seem to be part of elementary school math anymore.
I don't see many people getting down to the bits and bytes anymore - especially not to the bits. Are the days when one needs to know the bits gone or are they still around? Does understanding Binary (at least) still add an important component to a liberal computer science education? What do you think?
[Welcome Oregon Ducks from CIS 210 - if you are interested in a possibly interesting binary number project check out Binary Number Game.]
Try looking at some of the questions posted on comp.embedded or comp.dsp and you will see for yourself ;)
An interesting motivation for Bin/Hex is the way that I teach indexing in Arrays.
I feel that it's really important that students understand that an index is not a magical number in memory that the computer jumps to, but rather an offset from a base address, scaled by the size of your data. I require that my intro students, given the memory address (in hex) of the start of the memory allocated for an array and the size of each element, be able to calculate the exact memory address of the nth element. (Generally n is a given value and not a variable - I want them to actually do the multiplication and addition.)
Understanding that representation is one of the critical ahas in understanding computation. For example, learning that '0' and 0 are different. This only happens when character representation and the various number representations are grasped. (Scripting languages that silently convert between ASCII and number representations don't help with this later on.)
If you ever use a debugger (and people want to do that as an avenue to understanding, unfortunately), expressions of internal representations are going to crop up.
Having said this about "ahas," the next one, I suppose is the indirection that relates to addressing and referencing. Even with built-in garbage collection, I think it is critical to know the difference between holding a reference and holding the (direct representation of the) thing itself.
I'd say studying binary numbers (and hex, but maybe not octal so much) is still absolutely essential to computer science. However, if you are just a user of computers or only programming them at a very high level, then I suppose they're not so important. Regardless, if you are interested in how computers work, studying these number systems is important.
As you say, if you just want to understand decimal numbers better -- and don't care about their applications in computers -- then studying them is of use. Of course you could then argue why does it have to be a "power of 2" base? Why not base 5 or base 3? What I would say to that is that binary is still worthy of study as a special case place value system. The algorithms for conversion to/from binary are a bit simpler than conversion between non-binary bases. There are some insights to be gained from that.
I still use hex for web colors (#FFCC99) as well as WPF (#00FFCC99), but the last time I used them with any kind of intensity was when I did ML on the Commodore 64 (home of the fabled nybble!). Every few years I come across the need for a bit mask, but for the most part I don't use them any more. Still nice to know, though - bit masks can be quite cool, especially with bitwise booleans, and the left/right shift operators (<</>>) let you multiply and divide by powers of two in spite of the more common use of System.Math.
These bases, especially hex and binary, are still in very widespread use in industry, so I sure as heck hope that they are still being taught in schools!
Binary / Hex is still very important in design. Helps with color as mentioned, and also with image sizing.
We are also going to need another generation of bit-level geeks as we re-factor all of our technology using nano-tech and light instead of electricity.
I agree with those above. Binary is still important and fundamental to understanding some of the big ideas of C.S.
My students seem to appreciate it when I "pull back the curtain" on binary numbers. Even at the programming level understanding that EVERYTHING...EVERYTHING is eventually represented in binary helps explain all kinds of things. For example, why the largest positive int is (2^31)-1...why IP address segments don't get higher than 255...or RGB color values.
Binary is essential.
Clint Rutkas here: Even though I think it will say I'm Coding4Fun.
I totally agree with Baker Franke too. Knowing why lots of stuff is 0 to 255 or 0 to 1023.
Hex is required for CSS, and both binary and hex are required for a lot of hardware interaction. Right now a friend and I are arguing, for Maker Faire, about the update speed for an image that is being sent through a serial port.
I remember hearing my high school teachers a while back saying they were going to stop teaching matrix math. Game development lives and dies on that matrix math.
Coding4Fun: Matrices are no longer taught in New Zealand high schools - they've added geometric transformations to the curriculum instead. A big loss for algebra at uni IMHO.
As a former maths teacher I think that we have to keep in mind that there is a trade-off in teaching between deep-narrow and shallow-wide. Do we introduce a lot of topics but not cover them in much depth, or do we narrow the topics covered and go into them with depth? Also, whilst binary/hex are useful for those who go on to do CS at uni and end up doing system-level programming, this is only a tiny percentage of the population - probably less than 0.1 percent in most countries - so such learning isn't critical to maths education.
CS education at high school level is essentially dead in New Zealand. Is this a bad thing? Possibly not, as any kid that is going to be serious about CS is probably going to be coding before they've even started high school.