information entropy


from Wiktionary, Creative Commons Attribution/Share-Alike License

  • n. A measure of the uncertainty associated with a random variable; equivalently, the average information content one is missing when one does not know the variable's value (usually measured in bits), or the amount of information contained, on average, per character in a stream of characters.
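The bits-per-character reading of the definition can be sketched with Shannon's formula, H = −Σ p·log₂(p), applied to the character frequencies of a string (the function name and sample strings below are illustrative, not part of the entry):

```python
import math
from collections import Counter

def shannon_entropy(text):
    """Average information content, in bits, per character of `text`."""
    counts = Counter(text)
    total = len(text)
    # Sum -p * log2(p) over the empirical probability of each character.
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

# A uniform stream over 4 symbols carries 2 bits per character;
# a constant stream carries none.
print(shannon_entropy("abcd"))  # 2.0
print(shannon_entropy("aaaa"))  # 0.0
```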
