Information theory is a branch of mathematics concerned with the storage of data and the transmission of messages. It is widely used in the design and operation of electronic information systems, such as radio, television, and computer networks.
Error correction is an important use of information theory. Messages are susceptible to errors and distortions called noise. Examples of noise include radio static, scratches on a DVD, and typographical errors in a sentence. Experts use information theory to develop codes and other techniques to compensate for noise.
A simple example involves passing a note between friends. The intended message is 908, but in transit the note may be smudged, changing a digit: the 9 becomes a 4, or the 8 becomes a 3. The smudges are an example of noise, which damages messages by altering their contents. To safeguard the message, the friends might pass three copies of the note or write 908 three times on a single note. With three copies, one smudged digit is “outvoted” by the two unsmudged copies. Alternatively, the message can be written out in words as nine-zero-eight, making a smudge less likely to turn one digit into another. Each of these solutions improves reliability by sacrificing efficiency, a common trade-off in information theory.
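The “outvoting” idea above is known as a repetition code with majority-vote decoding. A minimal sketch in Python (the message, the number of copies, and the simulated smudge are illustrative choices, not part of any standard):

```python
from collections import Counter

def encode(message, copies=3):
    """Repetition code: send several identical copies of the message."""
    return [message] * copies

def decode(received):
    """Recover each position by majority vote across the copies."""
    return "".join(Counter(chars).most_common(1)[0][0]
                   for chars in zip(*received))

# Simulated noise: one copy is smudged in transit, turning the 9 into a 4.
received = ["408", "908", "908"]
print(decode(received))  # prints "908" -- the two clean copies outvote the smudge
```

Note the trade-off the article describes: sending three copies triples the transmission cost but tolerates any single-copy error in each position.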
Information theory is used in many ways. Biologists use it to analyze genetic mutations. Business managers and educators use it to study how people communicate. Information theory is also used in data compression and cryptography (the study and use of codes).
The British logician and mathematician George Boole developed concepts important to information theory as early as the mid-1800’s. The formal study of information theory began with the 1948 publication of the paper “A Mathematical Theory of Communication” by the American scientist Claude Shannon.