Huffman Coding
1. Initialization: put all the symbols in an OPEN list, kept sorted at all
   times (e.g., ABCDE). In our case the symbols will not be letters but
   numbers representing the grey-level or color values of the pixels in an
   image.
2. Create the bottom of a tree structure and assign each element in OPEN to
   a node at this level of the tree.
3. Repeat until the OPEN list has only one node left:
   (a) From OPEN pick the two nodes having the lowest
       frequencies/probabilities and create a parent node for them.
   (b) Assign the sum of the children's frequencies/probabilities to the
       parent node and insert it into OPEN.
   (c) Assign code 0 to one branch and 1 to the other, and delete the
       children from OPEN.
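A minimal Python sketch of steps 1-3 follows. The function name
build_codebook, the use of heapq as the OPEN list, and the tie-breaking
serial number are illustrative choices, not part of the original notes.

    # Huffman tree construction following steps 1-3 above.
    # A heap plays the role of the sorted OPEN list.
    import heapq

    def build_codebook(counts):
        """Return {symbol: bitstring} for the given symbol frequencies."""
        # Steps 1-2: one leaf node per symbol in the OPEN list.
        open_list = [(freq, i, sym)
                     for i, (sym, freq) in enumerate(sorted(counts.items()))]
        heapq.heapify(open_list)
        serial = len(open_list)      # tie-breaker for equal frequencies
        # Step 3: merge the two lowest-frequency nodes until one remains.
        while len(open_list) > 1:
            f1, _, a = heapq.heappop(open_list)   # (a) two lowest nodes
            f2, _, b = heapq.heappop(open_list)
            heapq.heappush(open_list, (f1 + f2, serial, (a, b)))  # (b) parent
            serial += 1
        codes = {}
        def assign(node, prefix):    # (c) 0 for one branch, 1 for the other
            if isinstance(node, tuple):
                assign(node[0], prefix + "0")
                assign(node[1], prefix + "1")
            else:
                codes[node] = prefix or "0"
        assign(open_list[0][2], "")
        return codes

    counts = {"A": 15, "B": 7, "C": 6, "D": 6, "E": 5}
    codes = build_codebook(counts)
    total_bits = sum(counts[s] * len(codes[s]) for s in counts)
    print(codes, total_bits)

Run on the example counts, this yields code lengths 1, 3, 3, 3, 3 and an
87-bit total, matching the table below; the exact 3-bit patterns assigned to
B-E depend on how frequency ties are broken.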
The resulting codebook (assignment of codes to input symbols):
Symbol Count log2(1/p) Code Subtotal (# of bits)
------ ----- --------- ---- --------------------
A 15 1.38 0 15
B 7 2.48 100 21
C 6 2.70 101 18
D 6 2.70 110 18
E 5 2.96 111 15
TOTAL (# of bits): 87
Discussions:
In the above example:
entropy = (15 x 1.38 + 7 x 2.48 + 6 x 2.70 + 6 x 2.70 + 5 x 2.96) / 39
        = 85.26 / 39 = 2.19 bits/symbol
(Entropy measures the information content of the source; it is a lower bound
on the average number of bits per symbol achievable by any code.)
Average number of bits needed for Huffman coding is: 87 / 39 = 2.23 bits/symbol
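The same figures can be checked with a short snippet (illustrative, not part
of the original notes):

    import math

    counts = {"A": 15, "B": 7, "C": 6, "D": 6, "E": 5}
    total = sum(counts.values())                  # 39 symbols
    entropy = sum(n * math.log2(total / n) for n in counts.values()) / total
    print(round(entropy, 2))     # 2.19 bits/symbol
    print(round(87 / total, 2))  # 2.23 bits/symbol for the Huffman code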
Adaptive Huffman Coding

Motivations: the previous algorithms require statistical knowledge of the
source, which is often not available (e.g., live audio or video). The
solution is to use adaptive algorithms. As an example, Adaptive Huffman
Coding is examined below; the idea is, however, applicable to other adaptive
compression algorithms.
ENCODER DECODER
------- -------
Initialize_model(); Initialize_model();
while ((c = getc (input)) != eof) while ((c = decode (input)) != eof)
{ {
encode (c, output); putc (c, output);
update_model (c); update_model (c);
} }
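A minimal Python sketch of this loop follows. Real adaptive Huffman coders
update the tree incrementally (the FGK and Vitter algorithms); to keep the
idea visible, this sketch instead has both sides rebuild the code from
identical running counts after every symbol, with every count starting at 1
so unseen symbols remain encodable. All names here (ALPHABET, build_code,
and so on) are illustrative choices, not part of the original notes.

    import heapq

    ALPHABET = "ABCDE"   # assumed fixed alphabet, known to both sides

    def build_code(counts):
        # Same Huffman construction as in the earlier sketch; the serial
        # number breaks frequency ties deterministically so that encoder
        # and decoder always build the identical tree.
        heap = [(counts[s], i, s) for i, s in enumerate(sorted(counts))]
        heapq.heapify(heap)
        serial = len(heap)
        while len(heap) > 1:
            f1, _, a = heapq.heappop(heap)
            f2, _, b = heapq.heappop(heap)
            heapq.heappush(heap, (f1 + f2, serial, (a, b)))
            serial += 1
        code = {}
        def walk(node, prefix):
            if isinstance(node, tuple):
                walk(node[0], prefix + "0")
                walk(node[1], prefix + "1")
            else:
                code[node] = prefix or "0"
        walk(heap[0][2], "")
        return code

    def adaptive_encode(message):
        counts = {s: 1 for s in ALPHABET}      # initialize_model()
        bits = ""
        for c in message:
            bits += build_code(counts)[c]      # encode(c, output)
            counts[c] += 1                     # update_model(c)
        return bits

    def adaptive_decode(bits, n_symbols):
        counts = {s: 1 for s in ALPHABET}      # initialize_model()
        out, i = "", 0
        for _ in range(n_symbols):
            inverse = {v: k for k, v in build_code(counts).items()}
            j = i + 1
            while bits[i:j] not in inverse:    # codes are prefix-free, so
                j += 1                         # the first match is the code
            c = inverse[bits[i:j]]
            out += c                           # putc(c, output)
            counts[c] += 1                     # update_model(c)
            i = j
        return out

    message = "AABACABDE"
    assert adaptive_decode(adaptive_encode(message), len(message)) == message

Because encoder and decoder apply the same deterministic update to the same
counts, their code tables stay in lock-step without any table ever being
transmitted.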
Summary
© Lynne Grewe