Noun
- 1. information, selective information, entropy, information measure
- usage: (communication theory) a numerical measure of the uncertainty of an outcome; "the signal contained thousands of bits of information"
- 2. randomness, entropy, S, physical property
- usage: (thermodynamics) a thermodynamic quantity representing the amount of energy in a system that is no longer available for doing mechanical work; "entropy increases as matter and energy in the universe degrade to an ultimate state of inert uniformity"
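The information-theoretic sense (1) is conventionally quantified by Shannon's formula, H = -Σ pᵢ log₂ pᵢ, measured in bits. A minimal sketch in Python (the function name `shannon_entropy` is illustrative, not from the source):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)) in bits.

    Zero-probability outcomes contribute nothing, so they are skipped.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin flip carries exactly 1 bit of uncertainty:
print(shannon_entropy([0.5, 0.5]))  # → 1.0
```

A certain outcome (probability 1) yields an entropy of 0, matching the intuition that it carries no information.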
WordNet 3.0 Copyright © 2006 by Princeton University.
All rights reserved.