In 1991, MIPS Technologies, the company responsible for the first commercial RISC architecture, produced the first commercially available 64-bit processor. The processor was initially used in SGI graphics workstations, which ran a 64-bit version of the IRIX operating system. Intel followed suit in 1994, announcing its IA-64 architecture, to be jointly developed with HP. By 1996, Fujitsu’s HAL Computer Systems, Sun, and HP had all released 64-bit processors. Most of these ran Solaris, IRIX, or other UNIX variants as operating systems. In 1999, AMD disclosed its 64-bit extensions to the IA-32 architecture, called x86-64.
In 2000, IBM released its 64-bit architecture, the zSeries z900, along with a new z/OS operating system. This combined package put competitors lacking a robust compatibility solution at a severe disadvantage. In 2001, Intel released Itanium, a 64-bit processor targeted at high-end servers, but quickly followed it with Itanium 2 in 2002, which boasted triple the bus bandwidth of its predecessor.
Finally, in 2003 AMD came out with its 64-bit Opteron and Athlon 64 processors. At the Worldwide Developer Conference that year, Steve Jobs also announced Apple’s release of personal computers built around IBM’s 64-bit PowerPC 970 chip. Apple described these computers as the “world’s fastest personal computers” to date.
Over the last year, several key announcements have been made:
It is interesting to note that the wave of dual-core machines has arrived almost simultaneously with 64-bit desktop processors. This further magnifies the effects of 64-bit computing, as the machines that many users will inevitably buy will be enhanced in two distinct ways.
While 64-bit computing seems sufficient for most applications today, systems with 128-bit registers have been developed (most notably IBM’s System/370, which supported 128-bit extended-precision floating-point operands). These capabilities were introduced in the 1970s for high-end computing, and their use has been reserved for server machines to date.