Entropy

Because it is determined by the number of possible microstates, entropy is related to the amount of additional information needed to specify the exact physical state of a system, given its macroscopic specification. For this reason, entropy is often described as an expression of the disorder or randomness of a system, or of the lack of information about it. The concept of entropy plays a central role in information theory.

Source: https://en.wikipedia.org/wiki/Entropy
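The information-theoretic reading above can be made concrete with Shannon entropy, which measures the average number of bits of additional information needed to identify an outcome of a random variable. A minimal sketch (the function name is illustrative, not from the source):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin: one bit of information is needed per toss.
print(shannon_entropy([0.5, 0.5]))        # 1.0
# Four equally likely outcomes: two bits are needed.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0
```

A biased coin yields less than one bit, matching the intuition that a more predictable system is less "disordered" and needs less information to specify its state.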

Entropy was last modified: August 11th, 2018 by Jovan Stosic
