Monday, August 22, 2005

Information Entropy Notes (Wikipedia)

Notes on Information Entropy from Wikipedia.

Entropy can be defined as the "amount of information carried in a signal", where a "signal" can be, for example, a string of characters. Alternatively, entropy is the "amount of randomness in a signal or a random event". The concept was introduced by Claude Shannon in his 1948 paper "A Mathematical Theory of Communication".
By Shannon's definition, entropy is the "minimum channel capacity required to reliably transmit the source as encoded binary digits" (hence the base-2 logarithm of the probability of the random event in the formula). Entropy is the "mathematical expectation of the amount of information carried in a digit from the information source". It is also a "measure of the uncertainty about the 'realization' of a random variable". For a data source, entropy is the average number of bits per symbol needed to encode it.
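Written out (a sketch of the standard definition that the quotes above paraphrase, not spelled out in the original notes): for a discrete random variable X whose outcomes occur with probabilities p(x), the entropy is

    H(X) = -\sum_x p(x) \log_2 p(x)    (bits per symbol)

The base-2 logarithm is what makes the unit "bits".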
Examples:
  • Fair coin flip - entropy is 1 bit per flip (see the sketch after this list)
  • If a system always generates the same output, its entropy is 0
  • Entropy of English text is about 1.5 bits per character
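A minimal Python sketch (my own check of the first two examples, using the standard formula above; the English-text figure is an empirical estimate and is not computed here):

    import math

    def entropy(probs):
        """Shannon entropy in bits of a discrete distribution given as a list of probabilities."""
        # Terms with p == 0 contribute nothing and are skipped to avoid log(0).
        return sum(-p * math.log2(p) for p in probs if p > 0)

    print(entropy([0.5, 0.5]))  # fair coin flip -> 1.0 bit per flip
    print(entropy([1.0]))       # source that always emits the same symbol -> 0.0 bits
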
