Comparison of Huffman and Shannon-Fano Algorithms

Symbol   Count (c)   ASCII Bit   ASCII Bits   Shannon-Fano     Shannon-Fano     Huffman      Huffman Bits
                     Size (sa)   (c × sa)     Bit Size (ssf)   Bits (c × ssf)   Size (shf)   (c × shf)
------   ---------   ---------   ----------   --------------   --------------   ----------   ------------
A            14          8          112             2                28              1             14
B             7          8           56             2                14              3             21
C             5          8           40             3                15              3             15
D             5          8           40             3                15              3             15
E             4          8           32             3                12              3             12
------   ---------   ---------   ----------   --------------   --------------   ----------   ------------
Total                               280                              84                            77
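The Huffman column of the table can be reproduced with a short sketch of the standard greedy construction (Python here; note that different tie-breaking between equal counts can yield a different but equally optimal tree):

```python
import heapq

def huffman_lengths(counts):
    # Min-heap of (total count, tiebreak id, symbols in subtree); merging
    # the two lightest subtrees lengthens every code inside them by one bit.
    heap = [(c, i, [s]) for i, (s, c) in enumerate(counts.items())]
    heapq.heapify(heap)
    lengths = dict.fromkeys(counts, 0)
    uid = len(heap)
    while len(heap) > 1:
        c1, _, syms1 = heapq.heappop(heap)
        c2, _, syms2 = heapq.heappop(heap)
        for s in syms1 + syms2:
            lengths[s] += 1
        heapq.heappush(heap, (c1 + c2, uid, syms1 + syms2))
        uid += 1
    return lengths

counts = {"A": 14, "B": 7, "C": 5, "D": 5, "E": 4}
lengths = huffman_lengths(counts)   # {'A': 1, 'B': 3, 'C': 3, 'D': 3, 'E': 3}
total = sum(counts[s] * lengths[s] for s in counts)   # 77 bits, as in the table
```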

 



Moving from the Shannon-Fano code to the Huffman code costs 7 extra bits to encode ‘B’, but saves 14 bits when coding the ‘A’ symbol, for a net saving of 7 bits.
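That arithmetic can be checked directly from the table; a quick sketch using the code sizes listed above:

```python
counts = {"A": 14, "B": 7, "C": 5, "D": 5, "E": 4}
sf_size = {"A": 2, "B": 2, "C": 3, "D": 3, "E": 3}   # Shannon-Fano sizes from the table
hf_size = {"A": 1, "B": 3, "C": 3, "D": 3, "E": 3}   # Huffman sizes from the table

# Per-symbol change in total bits when switching from Shannon-Fano to Huffman
delta = {s: counts[s] * (hf_size[s] - sf_size[s]) for s in counts}
# 'B' costs 7 extra bits, 'A' saves 14, the rest are unchanged
net = sum(delta.values())   # -7: a net saving of 7 bits
```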

Shannon-Fano and Huffman coding generally produce output of similar size. However, Huffman coding always matches or exceeds the efficiency of the Shannon-Fano method, and has therefore become the preferred coding method of its type (Nelson, 38).

 
