information theory

Definitions

from The American Heritage® Dictionary of the English Language, 5th Edition.

  • noun The theory of the probability of transmission of messages with specified accuracy when the bits of information constituting the messages are subject, with certain probabilities, to transmission failure, distortion, and accidental additions.

from the GNU version of the Collaborative International Dictionary of English.

  • noun (Math., Telecommunications) The science which studies the capacity of systems to contain and transmit information, and the factors such as noise and channel capacity that may affect the rate or accuracy of information transmission and reception.

from Wiktionary, Creative Commons Attribution/Share-Alike License.

  • noun mathematics A branch of applied mathematics and engineering involving the quantification of information sent over a communication channel, disregarding the meaning of the sent messages, exemplified by the noisy-channel coding theorem.

from WordNet 3.0 Copyright 2006 by Princeton University. All rights reserved.

  • noun (computer science) a statistical theory dealing with the limits and efficiency of information processing
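The "quantification of information" these definitions refer to is usually measured in bits via Shannon entropy, and the "channel capacity" limited by noise has a simple closed form for a binary symmetric channel. As a minimal illustrative sketch (not part of any dictionary entry; the function names are my own):

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)), skipping zero-probability outcomes."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def bsc_capacity(e):
    """Capacity of a binary symmetric channel with crossover probability e: C = 1 - H(e).
    Noise reduces the achievable rate below 1 bit per channel use."""
    return 1.0 - entropy([e, 1.0 - e])

# A fair coin toss carries exactly 1 bit of information.
print(entropy([0.5, 0.5]))   # 1.0

# A channel that flips about 11% of its bits can still carry roughly half a bit per use.
print(bsc_capacity(0.11))
```

The noisy-channel coding theorem mentioned above says that rates below this capacity are achievable with arbitrarily small error probability, and rates above it are not.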

Etymologies

Sorry, no etymologies found.

Examples

    Sorry, no example sentences found.

Comments


  • "Claude Shannon was the founding father of information theory. For a hundred years after the electric telegraph, other communication systems such as the telephone, radio, and television were invented and developed by engineers without any need for higher mathematics. Then Shannon supplied the theory to understand all of these systems together, defining information as an abstract quantity inherent in a telephone message or a television picture. Shannon brought higher mathematics into the game.

    When Shannon was a boy growing up on a farm in Michigan, he built a homemade telegraph system using Morse Code. Messages were transmitted to friends on neighboring farms, using the barbed wire of their fences to conduct electric signals. When World War II began, Shannon became one of the pioneers of scientific cryptography, working on the high-level cryptographic telephone system that allowed Roosevelt and Churchill to talk to each other over a secure channel. Shannon’s friend Alan Turing was also working as a cryptographer at the same time, in the famous British Enigma project that successfully deciphered German military codes. The two pioneers met frequently when Turing visited New York in 1943, but they belonged to separate secret worlds and could not exchange ideas about cryptography."

    --from "How We Know," Freeman Dyson's review of James Gleick's The Information: A History, a Theory, a Flood in The New York Review of Books

    February 24, 2011