According to Shannon, the entropy of an information source S is

    H(S) = \sum_{i=1}^{n} p_i \log_2 \frac{1}{p_i} = - \sum_{i=1}^{n} p_i \log_2 p_i

where p_i is the probability that symbol s_i in S will occur.
- log_2(1/p_i) indicates the amount of information contained in s_i, i.e., the number
of bits needed to code s_i.
- For example, in an image with a uniform distribution of gray-level
intensities, p_i = 1/256 for each of the 256 levels, so the number of bits needed to code
each gray level is log_2 256 = 8 bits. The entropy of this image is therefore 8.
- Q: How about an image in which half of the pixels are white (I = 220)
and half are black (I = 10)?
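A minimal Python sketch of the entropy formula makes both cases easy to check; the `entropy` helper below is illustrative, not part of the notes.

```python
import math

def entropy(probs):
    # Shannon entropy: H(S) = sum over i of p_i * log2(1/p_i),
    # skipping zero-probability symbols (they contribute nothing).
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

# Uniform 256-level image: p_i = 1/256 for every gray level.
uniform = [1.0 / 256] * 256
print(entropy(uniform))   # -> 8.0 bits per symbol

# Half the pixels white (I = 220), half black (I = 10):
# only two symbols actually occur, each with probability 1/2.
bilevel = [0.5, 0.5]
print(entropy(bilevel))   # -> 1.0 bit per symbol
```

So for the two-level image the entropy is 1 bit: even though the intensities are stored as 8-bit values, one bit per pixel suffices to distinguish white from black.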