Entropy
In statistical mechanics, entropy is a measure of the disorder in a system.
Similarly, information is measured by the *change*
in uncertainty in a system. Entropy, as defined by Shannon, is the uncertainty regarding
which symbols are chosen from a set of symbols with given *a priori* probabilities.
Since information is a decrease in uncertainty, we may regard entropy as the information
required to reconstruct the correct set of symbols. The more disorder, or entropy, a
system has, the more information is required to reconstruct that set of symbols. Entropy and
information are used interchangeably in Information Theory, although they do not always
mean the same thing. Entropy in a source is equal to the information per symbol needed to
reconstruct its output, and is given by H(p) = -Σᵢ pᵢ log₂ pᵢ, where pᵢ is the *a priori* probability of the i-th symbol. However, entropy in a channel *decreases* its information throughput.
Entropy, as illustrated, can either support or obstruct our efforts to send
information depending on where it occurs.
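The source-entropy definition above can be sketched in a few lines of Python. This is a minimal illustration, not drawn from the original text: the `entropy` helper is a hypothetical name, and the two example distributions are chosen only to show that a more disordered (closer to uniform) source needs more bits per symbol to reconstruct.

```python
from math import log2

def entropy(probs):
    """Shannon entropy H(p) in bits per symbol.

    probs: the a priori probabilities of each symbol in the set.
    Terms with probability 0 contribute nothing, by convention.
    """
    return -sum(p * log2(p) for p in probs if p > 0)

# A fair coin is maximally disordered for two symbols: 1 bit per toss.
print(entropy([0.5, 0.5]))   # 1.0

# A biased source is more predictable, so fewer bits are needed.
print(entropy([0.9, 0.1]))   # ≈ 0.469
```

A uniform distribution over n symbols gives the maximum H(p) = log₂ n, which is why more disorder in a source means more information is needed to reconstruct its output.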