information entropy

Definitions

from Wiktionary, Creative Commons Attribution/Share-Alike License

  • n. A measure of the uncertainty associated with a random variable; equivalently, the average information content one is missing when one does not know the value of the random variable (usually measured in units such as bits); the amount of information (measured in, say, bits) contained, on average, per character in a stream of characters.
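The "bits per character" reading of the definition can be made concrete with a small sketch. Assuming Shannon's standard formula H = -Σ p·log₂(p) over the empirical character frequencies of a string (the function name `shannon_entropy` is illustrative, not from the source):

```python
import math
from collections import Counter

def shannon_entropy(text):
    """Empirical Shannon entropy of a string, in bits per character."""
    counts = Counter(text)          # frequency of each distinct character
    n = len(text)
    # H = -sum over symbols of p * log2(p), with p the relative frequency
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A uniform stream over four symbols needs exactly 2 bits per character:
print(shannon_entropy("abcd" * 100))  # → 2.0
```

A constant stream such as `"aaaa"` has entropy 0 bits per character, the minimum; the maximum for an alphabet of k symbols, log₂(k), is reached when all symbols are equally likely.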

Etymologies

Sorry, no etymologies found.

Examples

Sorry, no example sentences found.
