from The American Heritage® Dictionary of the English Language, 4th Edition
- n. For a closed thermodynamic system, a quantitative measure of the amount of thermal energy not available to do work.
- n. A measure of the disorder or randomness in a closed system.
- n. A measure of the loss of information in a transmitted message.
- n. The tendency for all matter and energy in the universe to evolve toward a state of inert uniformity.
- n. Inevitable and steady deterioration of a system or society.
from Wiktionary, Creative Commons Attribution/Share-Alike License
- n. A measure of the amount of information and noise present in a signal. Originally a tongue-in-cheek coinage, it has fallen into disuse to avoid confusion with thermodynamic entropy.
- n. The tendency of a system that is left to itself to descend into chaos.
from the GNU version of the Collaborative International Dictionary of English
- n. A certain property of a body, expressed as a measurable quantity, such that when there is no communication of heat the quantity remains constant, but when heat enters or leaves the body the quantity increases or diminishes. If a small amount, h, of heat enters the body when its temperature is t in the thermodynamic scale the entropy of the body is increased by h ÷ t. The entropy is regarded as measured from some standard temperature and pressure. Sometimes called the thermodynamic function.
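The incremental rule in the definition above (the entropy increases by h ÷ t when a small amount of heat h enters a body at thermodynamic temperature t) can be sketched numerically. This is a minimal illustration; the function name and the example values are made up for this sketch, not taken from the source.

```python
def entropy_change(heat_joules, temperature_kelvin):
    """Entropy change dS = dQ / T for a small heat transfer at
    (approximately) constant thermodynamic temperature, in J/K."""
    if temperature_kelvin <= 0:
        raise ValueError("thermodynamic temperature must be positive")
    return heat_joules / temperature_kelvin

# Example: 500 J of heat entering a body held near 300 K
# raises its entropy by 500 / 300 ≈ 1.667 J/K.
print(round(entropy_change(500.0, 300.0), 3))
```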
from The Century Dictionary and Cyclopedia
- n. In physics: As used by Clausius, the inventor of the word, and others, that part of the energy of a system which cannot be converted into mechanical work without communication of heat to some other body, or change of volume.
- n. As used by Tait and others, the available energy; that part of the energy which is not included under entropy in the preceding sense.
from WordNet 3.0 Copyright 2006 by Princeton University. All rights reserved.
- n. (communication theory) a numerical measure of the uncertainty of an outcome
- n. (thermodynamics) a thermodynamic quantity representing the amount of energy in a system that is no longer available for doing mechanical work
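The communication-theory sense above (entropy as a numerical measure of the uncertainty of an outcome) is Shannon's H = −Σ p·log₂p, measured in bits. A minimal sketch, with made-up distributions for illustration:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)) in bits;
    zero-probability outcomes contribute nothing."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: 1 bit per toss.
print(shannon_entropy([0.5, 0.5]))  # 1.0
# A biased coin is more predictable, so its entropy is lower.
print(shannon_entropy([0.9, 0.1]))
```

A certain outcome (probability 1) has zero entropy: no uncertainty, no information gained by observing it.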
A typical rebuttal is to suggest that information entropy and thermodynamic entropy are unconnected, citing the apocryphal story that Shannon picked the term entropy because “nobody understands what it means”.
In an 1865 follow-up to Über die bewegende Kraft der Wärme (1850), Clausius introduced the term entropy, stating that the entropy of the universe tends to increase.
Rudolf J. E. Clausius also introduced (1850) a quantitative measure of irreversibility, which he later termed entropy, and he posited the so-called second law of thermodynamics: for a closed physical system, the total entropy cannot decrease in time; it can only increase or, at most, remain constant.
It didn’t help that von Neumann and Shannon started using the term entropy for a formula in information theory that looked a lot like Boltzmann’s expression for entropy.
Just as entropy is only a relative background effect, it is also not all that there is to reality.
Social entropy is a similar concept applied to people: we have a tendency toward anarchy unless society applies extensive energy (laws, police, Dick Cheney, etc.) to rein us in.
When it reaches that state, maximum entropy is achieved.
With the advent of quantum mechanics in the 1920s (see Section 3.4), Eyring developed his transition-state theory in 1935 and this showed that the activation entropy is also important.
This entropy is useful for all your random number needs.
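As an aside on that last usage: in computing, "entropy" usually means unpredictable bits gathered by the operating system for random-number generation. A minimal sketch using Python's standard-library `secrets` module, which draws on the OS entropy pool:

```python
import secrets

# Draw 16 bytes of OS-provided entropy, suitable for keys and tokens.
token = secrets.token_bytes(16)
print(len(token))  # 16

# A random choice backed by the same entropy source.
flip = secrets.choice(["heads", "tails"])
print(flip in ("heads", "tails"))  # True
```

For cryptographic purposes, `secrets` is preferred over the `random` module, whose generator is deterministic and predictable once seeded.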