The deep space probe
The modem
The personal computer

The deep space probe - a whisper in the dark

Thousands, even millions, of miles from Earth, a small space probe orbits a neighboring planet. This high-tech "camera" takes pictures of barren landscapes, huge craters, dense clouds of gas, and the starry sky beyond human reach. And from those same thousands or millions of miles away, the pictures come back clear and accurate. Yet these space probes are not products of this decade or the previous one. Deep space probes like the Mariner, which photographed the Martian landscape, were launched in the late 60's and early 70's. Before the Internet, cell phones, and digital networks, space probes were able to exchange information with Earth with remarkable precision. But how? How can a probe accomplish this with less power than a dimly lit desk lamp?

The answer lies in advanced error-correcting techniques. Since the 1950's, information theorists have been working on a challenge first posed by Shannon: he had proven that it was possible to send information in the presence of noise with very little error, but not how to do it. In the 1960's, a new coding technique called the Reed-Muller code shed light on how to achieve such a feat. It was this code, along with the later Reed-Solomon code, that made it possible for space probes to send information from millions of miles away; the same family of codes is used today in CD's, DVD's, and computer equipment.

The Reed-Muller code was an improvement on the Hamming code of the 1950's. Richard Hamming devised a code that could correct one error in every seven bits (a bit being a zero or a one) sent. The basis of the Hamming code was that for every seven bits of information, three would be "check-bits." Through some advanced mathematics and a little algebra, these check-bits could be used to find and correct errors. The Reed-Muller code improved on this by correcting up to 7 errors in every 32-bit block. That stronger correction came at a cost: only 6 of the 32 bits carried data, while the other 26 were check-bits. Still, when probes like the Mariner launched, the Reed-Muller code provided exactly what was needed to send pictures and other data reliably back to Earth. Although faint, these signals could be decoded with greater certainty than ever before thanks to these information theorists. At a rate of 16,000 bits/sec, scientists now had the power to extract data from the far reaches of the universe. Today, probes and rovers like the Pathfinder on Mars send information using formats very similar to the ones found in the early 60's.
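The Hamming scheme described above is simple enough to sketch in a few lines. The Python below is an illustrative implementation of the classic Hamming(7,4) code (not the probes' actual flight software): four data bits gain three check-bits, and any single flipped bit can be located and corrected.

```python
# A minimal sketch of the Hamming(7,4) code: 4 data bits plus 3 check-bits,
# able to locate and correct any single-bit error in the 7-bit codeword.

def hamming_encode(d):
    """Encode 4 data bits [d1, d2, d3, d4] into a 7-bit codeword."""
    d1, d2, d3, d4 = d
    # Each check-bit is the parity (XOR) of three of the data bits.
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    # Check-bits sit at positions 1, 2, and 4 (1-based).
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming_decode(c):
    """Correct up to one flipped bit, then return the 4 data bits."""
    c = c[:]
    # Recompute each parity over the positions it covers; the three results,
    # read as a binary number, give the position of the flipped bit.
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3
    if syndrome:                      # nonzero syndrome = 1-based error position
        c[syndrome - 1] ^= 1          # flip the bad bit back
    return [c[2], c[4], c[5], c[6]]

data = [1, 0, 1, 1]
word = hamming_encode(data)
word[5] ^= 1                          # simulate one bit flipped by channel noise
assert hamming_decode(word) == data   # the error is found and corrected
```

The same algebra of parity checks, scaled up, is what lets the Reed-Muller code trade 26 check-bits for 7 correctable errors per block.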

The modem

How the 56K Modem Works

For a long time people thought that the fastest speed a modem could achieve over an analog phone line was 33 Kbits/sec. Anyone who wanted to go faster would have to use a cable modem or some sort of Ethernet connection. People appealed to Shannon's law of maximum channel capacity, reasoning that rates above 33 Kbits/sec were impossible. After all, how can you argue against a proof that has held true for half a century? The question, then, is: how are new modems able to communicate at 56 Kbits/sec over analog phone lines when Shannon's law seems to forbid it?

The answer to this question is that Shannon's law does not actually forbid the transmission of data at 56 Kbits/sec. No laws of information theory are being broken and Shannon's proof is still safe from any harsh criticism. The trick comes in the way communication networks are configured. Shannon's law arrives at the maximum channel capacity by taking into account the bandwidth and the noise of a given channel. In the past, this is how data flowed between two modems:

1. Digital signals from the computer are converted (modulated) into analog electrical currents and transmitted across a phone line.
2. At a central office, the analog currents are sampled and encoded back into digital signals.
3. The digital signals are then converted back into analog currents and sent over another phone line.
4. The target modem receives the analog signal and converts it back into a digital signal for its computer.

The problem with this model is the repeated switching between analog and digital signals. Each analog-to-digital conversion introduces noise into the system (this is called quantization noise). So, according to Shannon's law, one can only hope to achieve a maximum speed of about 35 Kbits/sec, because the channel carries so much noise along with the data.
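Shannon's law can be checked with back-of-the-envelope numbers. The Python sketch below plugs assumed but typical figures for an analog voice line into the capacity formula C = B log2(1 + S/N): roughly 3.1 kHz of usable bandwidth and a signal-to-noise ratio near 35 dB once quantization noise is counted.

```python
import math

# Shannon's channel-capacity law: C = B * log2(1 + S/N).
# The bandwidth and SNR figures below are illustrative assumptions for a
# voice-grade phone line, not measured values.

def capacity(bandwidth_hz, snr_db):
    snr = 10 ** (snr_db / 10)          # convert decibels to a plain power ratio
    return bandwidth_hz * math.log2(1 + snr)

print(capacity(3100, 35))  # about 36,000 bits/sec
```

With those numbers the formula tops out in the mid-30,000s of bits per second, which is why the limit seemed unbreakable; remove the quantization noise, and the same formula allows considerably more.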

The 56 Kbits/sec modems exploit the fact that many ISP's (Internet Service Providers) connect to the telephone network over digital lines. Downstream information stays digital all the way from the ISP to the phone company's central office, and is converted to analog only once, on the final stretch to the user's modem. Eliminating the noisy analog-to-digital conversion leaves a much cleaner channel. When Shannon's law is applied to this new configuration, with less noise on the line, it is indeed possible to achieve 56 Kbits/sec while downloading data. Of course, when uploading information to the ISP over a telephone line, the limit is still 33 Kbits/sec because the costly analog-to-digital conversion must still be made. However, research is currently underway to reach 56 Kbits/sec in the upstream direction through further feats of engineering trickery.

IT's greatest application: The personal computer

While there is little doubt that Claude Shannon's work is the cornerstone of modern communication, it has also had a profound impact on the world of computing. The model for the modern computer shares a close relationship with Shannon's model of communication. After all, a computer is just one more medium used to transmit and receive information.

An important, fundamental concept of computers is rooted in Shannon's model of communication. In his paper, "A Mathematical Theory of Communication," Shannon argued that information could only be measured once it was stripped of any meaning. Since meaning is inherently a sociological concept, it has little value in the world of science. Shannon's model of communication simply took in one set of values as input and turned out another set of values as output. It left no room for interpretation of the message, because interpretation does not matter when transmitting the message across a channel. Modern computers operate in exactly the same way. They receive inputs from the keyboard, mouse, microphone, and other devices and produce output in the form of pictures, text, sound, and video. But the computer does not know what it is transmitting, because it is nothing more than the communication channel Shannon proposed fifty years ago. Any information that can be reduced to a binary sequence can be channeled by a computer, whether that information is video, audio, text, or motion.

Another major concept is the fact that a computer uses binary. Stripped down to its core, a computer does nothing more than switch sequences of currents on and off. Every picture, sound, and video on a computer can be reduced to zeros and ones. Even everyday computing words like "bit" stem from Shannon's idea of representing information as binary sequences.
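A two-line Python illustration of this point: the text "Hi" is, to the machine, nothing more than sixteen zeros and ones.

```python
# Reduce the text "Hi" to the bits that actually represent it in memory:
# each character becomes one 8-bit binary number.
bits = "".join(format(byte, "08b") for byte in "Hi".encode("ascii"))
print(bits)  # 0100100001101001
```

The same reduction applies, with more bytes, to pictures, sound, and video.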

Many other things done on computers owe a great deal to Shannon and other information theorists. Data compression is possible thanks to people like David A. Huffman, who devised a method of representing information in the fewest possible coding digits. Data compression has made it possible to create large and complex programs without exceedingly huge memory and storage requirements. It has also made it possible to listen to music on a computer in the form of "mp3" files. When a computer is viewed as a medium of communication, there is nothing Shannon proved that does not apply to it. Noise, compression, and encryption are all major issues in computing, and all find their roots in Claude Shannon and information theory.
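Huffman's idea can be sketched briefly. The Python below is an illustrative, much-simplified version (real compressors such as those behind mp3 files add far more machinery): repeatedly merge the two rarest symbols, so frequent symbols end up with short codes and rare ones with long codes.

```python
import heapq
from collections import Counter

# A minimal sketch of Huffman coding: frequent symbols get short codes,
# rare symbols get long ones, shrinking the total number of coding digits.

def huffman_codes(text):
    # One leaf per symbol, weighted by frequency. The counter `tick` breaks
    # ties so the heap never tries to compare the code dictionaries.
    heap = [(count, i, {sym: ""})
            for i, (sym, count) in enumerate(Counter(text).items())]
    heapq.heapify(heap)
    tick = len(heap)
    while len(heap) > 1:
        w1, _, left = heapq.heappop(heap)    # the two lightest subtrees
        w2, _, right = heapq.heappop(heap)
        # Merge them, prepending one more digit to every code inside.
        merged = {s: "0" + c for s, c in left.items()}
        merged.update({s: "1" + c for s, c in right.items()})
        heapq.heappush(heap, (w1 + w2, tick, merged))
        tick += 1
    return heap[0][2]

text = "abracadabra"
codes = huffman_codes(text)
encoded = "".join(codes[ch] for ch in text)
# 'a' occurs most often, so it gets the shortest code, and the message
# needs far fewer bits than the 8 per character of plain ASCII.
print(len(encoded), "bits instead of", 8 * len(text))
```

Because no code is a prefix of another, the bit stream can be decoded unambiguously, which is what makes the scheme usable for real storage and transmission.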