What is information? Explore the surprising answer of American mathematician Claude Shannon, who concluded that information is the ability to distinguish reliably among possible alternatives. Consider why this idea was so revolutionary, and see how it led to the concept of the bit - the basic unit of information.
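To make this definition concrete: distinguishing reliably among N equally likely alternatives takes log2(N) bits. A minimal sketch in Python (the example counts are our own illustrations, not from the lecture):

```python
import math

def bits_to_distinguish(n_alternatives: int) -> float:
    """Bits needed to distinguish among n equally likely alternatives."""
    return math.log2(n_alternatives)

# One coin flip distinguishes 2 outcomes: 1 bit.
print(bits_to_distinguish(2))    # 1.0
# Picking one of 8 doors takes 3 yes/no questions: 3 bits.
print(bits_to_distinguish(8))    # 3.0
# A letter from a 26-letter alphabet carries at most ~4.7 bits.
print(bits_to_distinguish(26))   # ~4.70
```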
Accompany the young Claude Shannon to the Massachusetts Institute of Technology, where in 1937 he submitted a master's thesis proving that Boolean algebra could be used to simplify the unwieldy analog computing devices of the day. Drawing on Shannon's ideas, learn how to design a simple electronic circuit that performs basic mathematical calculations.
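As a taste of how Boolean algebra maps onto arithmetic circuits, here is a half adder, the standard one-bit building block, sketched in Python with XOR and AND standing in for physical gates. This is a hedged illustration of the general technique, not the lecture's specific circuit:

```python
def half_adder(a: int, b: int) -> tuple[int, int]:
    """Add two one-bit inputs using only Boolean operations.

    XOR gives the sum bit; AND gives the carry bit.
    """
    return a ^ b, a & b

def full_adder(a: int, b: int, carry_in: int) -> tuple[int, int]:
    """Chain two half adders to add three one-bit inputs."""
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, c1 | c2

# 1 + 1 = 10 in binary: sum bit 0, carry bit 1.
print(half_adder(1, 1))  # (0, 1)
```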
How is information measured and how is it encoded most efficiently? Get acquainted with a subtle but powerful quantity that is vital to the science of information: entropy. Measuring information in terms of entropy sheds light on everything from password security to efficient binary codes to how to design a good guessing game.
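The calculation behind all of this is Shannon's entropy formula, H = -Σ p_i log2 p_i, measured in bits. A minimal Python sketch, with example inputs of our own choosing:

```python
import math
from collections import Counter

def entropy(probabilities) -> float:
    """Shannon entropy H = -sum(p * log2(p)) in bits per symbol."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

def text_entropy(text: str) -> float:
    """Estimate per-character entropy from observed frequencies."""
    counts = Counter(text)
    total = len(text)
    return entropy(c / total for c in counts.values())

# A fair coin is maximally uncertain: 1 bit per toss.
print(entropy([0.5, 0.5]))          # 1.0
# A biased coin carries less information per toss.
print(entropy([0.9, 0.1]))          # ~0.469
print(text_entropy("abracadabra"))  # ~2.04 bits/char
```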
Probe the link between entropy and coding. In the process, encounter Shannon's first fundamental theorem, which specifies how far information can be squeezed in a binary code, serving as the basis for data compression. See how this works with a text such as Conan Doyle's The Return of Sherlock Holmes.
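Codes like Huffman's approach the limit that Shannon's first theorem sets: frequent symbols get short codewords, rare ones get long codewords. Below is a compact, hedged sketch of Huffman coding in Python; the sample text is our own stand-in, not the course's worked example:

```python
import heapq
from collections import Counter

def huffman_code(text: str) -> dict[str, str]:
    """Build a Huffman code: frequent symbols get short codewords."""
    heap = [(freq, i, {sym: ""})
            for i, (sym, freq) in enumerate(Counter(text).items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)
        f2, _, c2 = heapq.heappop(heap)
        # Prefix the lighter subtree's codes with 0, the heavier's with 1.
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (f1 + f2, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]

text = "the return of sherlock holmes"
code = huffman_code(text)
encoded = "".join(code[ch] for ch in text)
print(f"{len(text) * 8} bits in ASCII -> {len(encoded)} bits encoded")
```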
Learn how some data can be compressed beyond the minimum amount of information required by the entropy of the source. Typically used for images, music, and video, these techniques drastically reduce the size of a file without significant loss of quality. See how this works in the MP3, JPEG, and MPEG formats.
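The essence of lossy compression can be caricatured in a few lines: discard precision the audience will barely notice. The quantizer below is a toy stand-in for what JPEG does to frequency coefficients and MP3 does to masked audio bands, not the actual codecs:

```python
def quantize(samples, levels: int):
    """Lossy step: round each sample in [0, 1] to one of `levels` values.

    Fewer levels -> fewer bits per sample -> smaller file, lower fidelity.
    """
    return [round(s * (levels - 1)) / (levels - 1) for s in samples]

signal = [0.12, 0.47, 0.48, 0.91, 0.90, 0.13]
coarse = quantize(signal, 4)   # 2 bits per sample instead of, say, 16
print(coarse)  # [0.0, 0.333..., 0.333..., 1.0, 1.0, 0.0]
```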
One of the key issues in information theory is noise: the message received may not convey everything about the message sent. Discover Shannon's second fundamental theorem, which proves that error correction is possible and can be built into a message with only a modest slowdown in transmission rate.
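A hedged sketch of the trade-off the theorem governs, using the simplest possible scheme, a threefold repetition code: the error rate drops sharply while the transmission rate falls by a factor of three (the flip probability is our own illustrative parameter):

```python
import random

def noisy_channel(bits, flip_prob: float):
    """Flip each bit independently with probability flip_prob."""
    return [b ^ (random.random() < flip_prob) for b in bits]

def encode(bits):
    """Repetition code: send every bit three times."""
    return [b for bit in bits for b in (bit, bit, bit)]

def decode(bits):
    """Majority vote over each group of three received bits."""
    return [int(sum(bits[i:i + 3]) >= 2) for i in range(0, len(bits), 3)]

random.seed(0)
message = [random.randint(0, 1) for _ in range(10_000)]
received = decode(noisy_channel(encode(message), flip_prob=0.05))
errors = sum(m != r for m, r in zip(message, received))
# The raw channel corrupts ~5% of bits; majority voting cuts that to
# ~0.7% (3p^2 - 2p^3), at the cost of a 3x slower transmission rate.
print(errors / len(message))
```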
Dig into different techniques for error correction. Start with a game called word golf, which demonstrates the perils of mistaking one letter for another and how to guard against it. Then graduate to approaches used for correcting errors in computer operating systems, CDs, and data transmissions from the Voyager spacecraft.
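One workhorse technique in this family is the Hamming (7,4) code, which protects four data bits with three parity bits and corrects any single-bit error. A minimal sketch, our own rendering of the standard construction:

```python
def hamming_encode(d):
    """Encode 4 data bits as 7 bits, parity bits at positions 1, 2, 4."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4   # covers positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4   # covers positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4   # covers positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming_decode(c):
    """Recompute parities; their pattern names the flipped position."""
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3  # 0 means no error detected
    if syndrome:
        c = c.copy()
        c[syndrome - 1] ^= 1  # flip the corrupted bit back
    return [c[2], c[4], c[5], c[6]]

word = [1, 0, 1, 1]
sent = hamming_encode(word)
sent[5] ^= 1                         # corrupt one bit in transit
print(hamming_decode(sent) == word)  # True: the error was corrected
```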
Twelve billion miles from Earth, the Voyager spacecraft is sending back data with just a 20-watt transmitter. Make sense of this amazing feat by delving into the details of the Nyquist-Shannon sampling theorem, signal-to-noise ratio, and bandwidth - concepts that apply to many types of communication.
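These quantities meet in the Shannon-Hartley formula, C = B log2(1 + S/N): capacity grows with bandwidth B and with signal-to-noise ratio S/N, yet survives even when the signal is weaker than the noise. A sketch with illustrative numbers; the figures below are rough assumptions, not Voyager's actual link budget:

```python
import math

def channel_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley: maximum error-free bit rate over a noisy channel."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# A comfortable terrestrial link: 1 MHz of bandwidth, SNR of 1000 (30 dB).
print(f"{channel_capacity(1e6, 1000):.0f} bits/s")   # ~9.97 million

# A starved deep-space link (illustrative numbers): even with SNR far
# below 1, some capacity survives; the signal is weak, not gone.
print(f"{channel_capacity(1e6, 0.01):.0f} bits/s")   # ~14,355
```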
The one-time pad may be unbreakable in principle, but consider the common mistakes that make this code system vulnerable. Focus on the Venona project that deciphered Soviet intelligence messages encrypted with one-time pads. Close with the mathematics behind public key cryptography, which makes modern transactions secure - for now.
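The pad itself, and the reuse mistake Venona exploited, both fit in a few lines: XOR-ing two ciphertexts made with the same pad cancels the key and leaks the XOR of the plaintexts. A minimal sketch, with messages of our own invention:

```python
import secrets

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def otp_encrypt(plaintext: bytes, pad: bytes) -> bytes:
    """Perfectly secret ONLY if the pad is random, secret, and never reused."""
    assert len(pad) >= len(plaintext)
    return xor_bytes(plaintext, pad)

pad = secrets.token_bytes(32)
c1 = otp_encrypt(b"ATTACK AT DAWN", pad)
c2 = otp_encrypt(b"RETREAT AT DUSK", pad)   # fatal mistake: pad reused

# The key cancels out: c1 XOR c2 equals m1 XOR m2, a crackable
# relationship between plaintexts, exactly what Venona analysts used.
leak = xor_bytes(c1, c2)
print(leak == xor_bytes(b"ATTACK AT DAWN", b"RETREAT AT DUSK"))  # True
```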
Study the workings of our innermost information system: the brain. Take both top-down and bottom-up approaches, focusing on the world of perception, experience, and external behavior on the one hand, and the intricacies of neuron activity on the other. Then estimate the total information capacity of the brain.
Maxwell's demon has startling implications for the push toward ever-faster computers. Probe the connection between the second law of thermodynamics and the erasure of information, which turns out to be a practical barrier to computer processing speed. Learn how computer scientists deal with the demon.
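The barrier can be made quantitative with Landauer's principle: erasing one bit must dissipate at least kT ln 2 of energy as heat. A back-of-envelope sketch; the erasure rate below is an illustrative assumption, not a measured chip figure:

```python
import math

BOLTZMANN = 1.380649e-23  # Boltzmann constant, J/K

def landauer_limit_joules(temp_kelvin: float) -> float:
    """Minimum energy dissipated to erase one bit: k * T * ln(2)."""
    return BOLTZMANN * temp_kelvin * math.log(2)

per_bit = landauer_limit_joules(300)      # room temperature
print(f"{per_bit:.2e} J per erased bit")  # ~2.87e-21 J

# Illustrative assumption: a chip erasing 10^20 bits per second would
# dissipate this many watts from the Landauer limit alone.
print(f"{per_bit * 1e20:.3f} W")          # ~0.287 W
```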
Contrast Shannon's code- and communication-based approach to information with a new, algorithmic way of thinking about the problem in terms of descriptions and computations. See how this idea relates to Alan Turing's theoretical universal computing machine, which underlies the operation of all digital computers.
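To make the universal machine concrete, here is a minimal Turing machine simulator: a finite rule table mapping (state, symbol) to (write, move, next state), driving a head along an unbounded tape. The unary incrementer it runs is our own toy example:

```python
from collections import defaultdict

def run_turing_machine(rules, tape, state="start", steps=1000):
    """Drive a tape head by a finite rule table until the machine halts."""
    tape = defaultdict(lambda: "_", enumerate(tape))  # "_" is the blank
    head = 0
    for _ in range(steps):
        if state == "halt":
            break
        write, move, state = rules[(state, tape[head])]
        tape[head] = write
        head += 1 if move == "R" else -1
    return "".join(tape[i] for i in sorted(tape)).strip("_")

# Toy machine: scan right over a unary number, append one more "1".
increment = {
    ("start", "1"): ("1", "R", "start"),
    ("start", "_"): ("1", "R", "halt"),
}
print(run_turing_machine(increment, "111"))  # 1111
```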
Algorithmic information is plagued by a strange impossibility that shakes the very foundations of logic and mathematics. Investigate this drama in four acts, starting with a famous conundrum called the Berry Paradox and including Turing's surprising proof that no single computer program can determine whether other programs will ever halt.
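Turing's proof can itself be sketched as a program. Suppose a perfect halts(program, input) oracle existed; the contrary program below would then halt exactly when the oracle says it loops, and vice versa, so no such oracle can exist. This is the standard diagonal argument rendered in Python, with the impossible oracle left unimplemented:

```python
def halts(program_source: str, program_input: str) -> bool:
    """Hypothetical oracle: True iff the program halts on the input.

    Turing proved no always-correct implementation can exist.
    """
    raise NotImplementedError("no program can implement this correctly")

def contrary(program_source: str) -> None:
    """Do the opposite of whatever the oracle predicts about self-application."""
    if halts(program_source, program_source):
        while True:   # oracle said "halts", so loop forever
            pass
    # oracle said "loops forever", so halt immediately

# Feed contrary its own source: if halts() says it halts, it loops;
# if halts() says it loops, it halts. Either answer is wrong, so no
# correct halts() can exist in the first place.
```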
Enter the quantum realm to see how this revolutionary branch of physics is transforming the science of information. Begin with the double-slit experiment, which pinpoints the bizarre behavior that makes quantum information so different. Work your way toward a concept that seems positively magical: the quantum computer.
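A small taste of why quantum information is different: simulate one qubit as two amplitudes. One Hadamard gate puts it in an equal superposition (both "slits" at once); a second Hadamard makes the amplitudes for one outcome cancel, interference in miniature. This plain-Python sketch is our own illustration:

```python
import math

H = [[1 / math.sqrt(2),  1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]   # Hadamard gate

def apply(gate, qubit):
    """Multiply a 2x2 gate into a 2-component amplitude vector."""
    a, b = qubit
    return [gate[0][0] * a + gate[0][1] * b,
            gate[1][0] * a + gate[1][1] * b]

zero = [1.0, 0.0]               # the |0> state
superposed = apply(H, zero)     # equal amplitudes: both "slits" at once
back = apply(H, superposed)     # interference: the |1> amplitude cancels

print([round(x, 3) for x in superposed])  # [0.707, 0.707]
print([round(x, 3) for x in back])        # [1.0, 0.0], back to |0>

# Measurement probabilities are squared amplitudes: a 50/50 coin after
# one H, yet a certain outcome after two. Classical bits can't do this.
```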
Survey the phenomenon of information from prehistory to the projected far future, focusing on the special problem of anti-cryptography - designing an understandable message for future humans or alien civilizations. Close by revisiting Shannon's original definition of information and asking: What does the theory of information leave out?