Tuesday 16 December 2014

Should Humans Learn To Count in Hexadecimal?


Computers use a binary, or base-2, counting system for reasons beyond the scope of this post. Since binary numbers are long to write, programmers adopted larger bases that map directly onto groups of bits, namely octal (base-8, 3 bits per digit) and hexadecimal (base-16, 4 bits per digit), with hexadecimal the more commonly used. Two hexadecimal digits represent a byte, and other computer word sizes divide evenly into bytes. For example, a 32-bit address is 4 bytes, or 8 hexadecimal digits. The six extra digits are written A-B-C-D-E-F, representing the decimal numbers 10-11-12-13-14-15, and to make the base clear such numbers are often prefixed with 0x or #.
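To make the byte/digit correspondence concrete, here is a small Python sketch (the address value is an arbitrary example, not anything from a real machine):

```python
# A 32-bit address is 4 bytes, so it prints as exactly 8 hexadecimal digits.
address = 3_735_928_559  # arbitrary 32-bit value chosen for illustration

print(f"0x{address:08X}")  # 0xDEADBEEF

# Each pair of hex digits is exactly one byte of the word.
as_bytes = address.to_bytes(4, "big")
print([f"{b:02X}" for b in as_bytes])  # ['DE', 'AD', 'BE', 'EF']
```

Note how the byte boundaries are visible in the hexadecimal form; in decimal (3735928559) they are not.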
Decimal, or base-10, is the system humans mostly use, and while it seems logical given our ten fingers, the choice is arbitrary in a way the computer's is not: humans can use any base system (see below). The result is that computers have to convert the base of every number they display to the user, and of every number the user enters. While this is not difficult, the extra step creates issues that need not exist.
One of these issues is rounding error.  In both systems 1/3 cannot be represented exactly: the best that can be done in base-10 is 0.3333..., and in hexadecimal (and equivalently octal and binary) the number is 0x0.5555...  There are also numbers that can be represented exactly in decimal but not in hexadecimal.  For example, 0.1 is 0x0.1999...  While people easily understand rounding issues in base-10, such as seeing a graph with three 33% slices and knowing they really add up to 100%, issues introduced by this conversion often go unnoticed, or produce odd results such as 0.9999 where people expect to see 1.
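Python can show this directly: float.hex() exposes the hexadecimal form of the double actually stored, and the repeating digits of 0.1 are visible in it.

```python
# 0.1 has no finite binary (or hexadecimal) expansion, so the stored
# double is only an approximation; float.hex() shows the repeating 9s.
print((0.1).hex())          # 0x1.999999999999ap-4

# The rounding errors surface in ordinary arithmetic.
print(0.1 + 0.2 == 0.3)     # False
print(f"{0.1 + 0.2:.17f}")  # 0.30000000000000004
```

This is exactly the "odd number where 1 was expected" problem: the conversion between the user's base-10 input and the machine's base-2 storage is lossy.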
Another issue is that many limits in computers come from the binary system, but in base-10 they look arbitrary.  Take the RGB colour values used by many programs, including photo editors and CSS: the user can enter only 0 to 255.  There seems no rational reason they cannot change 255 to 256, unless they realise it is really a two-digit hexadecimal number.  Then it becomes clear that 0xFF + 0x1 = 0x100, a three-digit number too big to fit in the two-digit space.
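A quick sketch of why the limit falls where it does (using Python's to_bytes to stand in for the single byte an RGB channel occupies):

```python
# An RGB channel is one byte: two hex digits, 0x00 through 0xFF.
channel = 0xFF
print(channel)        # 255 -- the familiar but arbitrary-looking decimal limit
print(channel + 0x1)  # 256, i.e. 0x100: three hex digits

# One more than 0xFF no longer fits in a single byte.
try:
    (channel + 1).to_bytes(1, "big")
except OverflowError:
    print("0x100 does not fit in one byte")
```

Written in hexadecimal, 255 stops looking like a strange cutoff and starts looking like what it is: the largest two-digit number.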
A bright spot is that the cost of learning is minimal, because people already have the tools to use hexadecimal numbers.  For example, to calculate 0x4D + 0xC5 using elementary school techniques:
  • Add 0xD + 0x5 to get 0x12
  • Carry the 0x1
  • Add 0x1 + 0x4 + 0xC and you get 0x11
  • Carry the 0x1 again; since nothing is left in that column, the first digit is simply 0x1
  • The answer is 0x112
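The steps above can be checked mechanically; Python accepts hexadecimal literals directly, so the column-by-column sums and the final answer are easy to verify:

```python
# Verifying the worked example 0x4D + 0xC5, digit by digit.
low = 0xD + 0x5          # 0x12: write down 2, carry 1
assert low == 0x12
high = 0x1 + 0x4 + 0xC   # 0x11: write down 1, carry 1 into a new digit
assert high == 0x11

print(f"0x{0x4D + 0xC5:X}")  # 0x112
```

The long-addition algorithm is identical to the decimal one; only the point at which a column overflows (16 instead of 10) changes.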
While new symbols and vocabulary would be needed, people have also stopped using Roman numerals (base-1 with compression?) and have switched to the metric system.  As computers are now the primary tool for mathematical calculation, it seems only prudent that humans consider using the native numbering system of computing when doing math themselves.



Wednesday 3 December 2014

The Desktop Metaphor No Longer Makes Sense


The desktop metaphor is meant to convey to the user that the files and folders on their computer are similar to the files and folders on their desk.  In addition, the work the user does at their desk is similar to how they work on a file when it is open in an application.  However, when the metaphor was introduced, computers were very large (and often on the desk) and monitors used CRT technology. Even notebooks took up a monitor-sized footprint on the desk. As a result, the computer was not just another work area: it actually replaced your physical desk with a virtual one.
In the last decade, CRTs were replaced by flat-screen monitors, often more than one.  Because of their smaller size they leave usable space on the desk, and many can even be mounted directly on a wall (if the desk is against one). Therefore, especially with multiple large monitors, your virtual desktop is less like a desktop and more like an interactive wall above the desk, with your physical desktop free for traditional, non-virtual uses.
As a result, the concept of the desktop seems to be becoming less relatable than it once was, and new form factors like the tablet are making the detachment worse.  The virtual world is expanding into the physical one, as people can now walk around with windows that were once confined to a computer screen. The advent of the cloud and networking has also changed the way people store and move files, with the expectation that they are simply everywhere.
What is needed is a successor to the desktop metaphor that captures these new ways of working in a blended physical/virtual world, yet still applies to the contemporary desktop.