AN2121 Freescale Semiconductor / Motorola, AN2121 Datasheet - Page 8


Part Number: AN2121
Description: JPEG2000 Arithmetic Encoding on StarCore SC140
Manufacturer: Freescale Semiconductor / Motorola
Background Theory
2.1.2 Limitations of Huffman Coding
In the above example, the code rate and the entropy are the same value, 1.75, because the probability for each symbol is equal to 2^(-l_i), where, as before, l_i is equal to the length of the ith symbol. In most cases, however, this is not true, and the limitations of Huffman coding become evident.
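The 1.75-bit figure can be checked numerically. The sketch below assumes one common reconstruction of the example: four symbols with probabilities 1/2, 1/4, 1/8, 1/8 and Huffman code-word lengths 1, 2, 3, 3, so that p_i = 2^(-l_i) holds exactly. The specific probabilities and lengths are assumptions for illustration, not taken verbatim from the app note's table.

```python
import math

# Assumed reconstruction of the example: probabilities that are exact
# powers of two, p_i = 2**(-l_i).
probs = [0.5, 0.25, 0.125, 0.125]   # assumed symbol probabilities
lengths = [1, 2, 3, 3]              # assumed Huffman code-word lengths l_i

# Entropy H = -sum(p_i * log2(p_i))
entropy = -sum(p * math.log2(p) for p in probs)
# Average code rate R = sum(p_i * l_i)
rate = sum(p * l for p, l in zip(probs, lengths))

print(entropy, rate)  # both 1.75: the code rate meets the entropy
```

When every probability is a negative power of two, the Huffman code is exactly optimal; the following subsections describe what happens when this condition fails.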
2.1.2.1 Minimum Obtainable Coding Rate
In Huffman coding, each code word must consist of at least one bit, which results in an average coding rate of at least one bit per symbol. However, the theoretical minimum, or entropy, of a system can be less than one bit per symbol, so in some cases it is impossible for Huffman coding to attain the theoretical minimum.
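A heavily skewed binary source makes this limitation concrete. The sketch below uses an assumed probability of 0.99 for the more likely symbol (the figure is illustrative, not from the app note): the entropy is far below one bit per symbol, yet Huffman coding cannot spend less than one bit per symbol on this source.

```python
import math

# Assumed skewed binary source: entropy well below 1 bit/symbol,
# yet any Huffman code still spends at least 1 bit per symbol.
p = 0.99  # assumed probability of the more likely symbol
entropy = -(p * math.log2(p) + (1 - p) * math.log2(1 - p))
huffman_rate = 1.0  # shortest possible code word is one bit

print(round(entropy, 3))  # ~0.081 bits/symbol
print(huffman_rate / entropy)  # Huffman spends roughly 12x the minimum
```

Sources like this are common in JPEG2000 bit-plane coding, which is one reason the standard uses an arithmetic coder rather than Huffman codes.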
2.1.2.2 Varying Probability
In the example, it has been assumed that the probabilities of the various symbols are fixed. In many cases,
however, the probability varies, so the average number of bits per symbol in a Huffman coding scheme
could be much larger than the entropy. For instance, if symbol e in the above example actually occurred with a probability of, say, 0.5, but still had 3 bits in its code word, the output bit stream would average 3 bits for half of the time rather than an eighth of the time as expected. Therefore, it would be preferable to be
able to adapt the coding system to update the probabilities in real-time, which requires some sort of
statistical estimation process. However, implementing the Huffman algorithm is a computationally
intensive process, so modifying the code words adaptively could be prohibitively expensive from a
computational point of view.
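The cost of such a mismatch can be quantified. The sketch below assumes the four-symbol example with code lengths 1, 2, 3, 3 (the symbol names and both probability sets are illustrative assumptions): the code was designed for one distribution, but symbol e, still carrying a 3-bit code word, now occurs half of the time.

```python
import math

# Assumed fixed Huffman code lengths from the earlier example.
lengths = {'a': 1, 'b': 2, 'c': 3, 'e': 3}
design_probs = {'a': 0.5, 'b': 0.25, 'c': 0.125, 'e': 0.125}
# Assumed changed statistics: e now dominates but keeps its 3-bit code.
actual_probs = {'a': 0.125, 'b': 0.25, 'c': 0.125, 'e': 0.5}

# Average rate of the stale code versus the new achievable minimum.
rate = sum(actual_probs[s] * lengths[s] for s in lengths)
entropy = -sum(p * math.log2(p) for p in actual_probs.values())

print(rate)     # 2.5 bits/symbol with the stale code
print(entropy)  # 1.75 bits/symbol achievable
```

Under these assumed statistics the stale code wastes 0.75 bits on every symbol, which is the penalty an adaptive coder would avoid.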
2.1.2.3 Practical Implementation
In practical implementations of the Huffman coder, a lookup table is usually used. The number of table entries must be 2^L, where L is the maximum length of the code words. In the above example, the maximum length is 3, so the lookup table must contain 8 entries to account for all combinations of 3 digits containing the relevant symbol with a shorter prefix. The bit stream in this case is examined 3 bits at a time, with these bits providing the addressing to the table. For example, the table entry for 100 would be c (treating the right-hand bit as the next input, i.e. the code stream input is read from right to left). In this case, only one digit (the right-hand 0) is required to specify a particular code word, so only that digit is dropped, and the next bit in the bit stream is concatenated to the remaining 10 to form the next lookup table entry (which, in this particular case, would be c again). The problem here lies in the fact that, because the Huffman algorithm does not limit the length of the code words, the table can grow very large.
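The table-driven scheme can be sketched as follows. The code words below are assumptions for illustration (a prefix-free code with maximum length 3), and the bits are read left to right here for simplicity, whereas the app note's example reads the stream right to left. Each of the 2^L entries maps an L-bit window to a decoded symbol and the number of bits its code word actually consumed.

```python
# Illustrative table-driven Huffman decoder with 2**L entries.
CODE = {'a': '0', 'b': '10', 'c': '110', 'd': '111'}  # assumed code words
L = max(len(cw) for cw in CODE.values())              # L = 3

# Build the 2**L-entry table: window -> (symbol, bits consumed).
table = {}
for index in range(2 ** L):
    window = format(index, f'0{L}b')
    for sym, cw in CODE.items():
        if window.startswith(cw):
            table[window] = (sym, len(cw))
            break

def decode(bits):
    out = []
    pos = 0
    while pos < len(bits):
        window = bits[pos:pos + L].ljust(L, '0')  # pad the final window
        sym, used = table[window]
        out.append(sym)
        pos += used       # drop only the consumed digits
    return ''.join(out)

print(decode('0101101110'))  # -> abcda
```

The table has 2^3 = 8 entries here; a code with a 16-bit maximum length would already need 65536 entries, which is the growth problem described above.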
These problems are partially addressed with the use of arithmetic coding.
2.2 Arithmetic Coding
For simplicity, the following discussion of arithmetic coding assumes a sequence of symbols from a stationary random process with independent elements. In this case, arithmetic coding produces a rate that approaches the entropy of the sequence; the result also applies to correlated sequences, provided the process remains stationary. This discussion does not describe how arithmetic coding adapts to changing statistics in the bit stream, because the adaptive nature of the coding is mainly determined by the coefficient bit modeler within the JPEG2000 standard. A good tutorial on arithmetic coding can be found in reference [3].
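The core idea can be sketched in a few lines: the coder maintains a single interval and narrows it by each symbol's probability, so the final interval width is the product of the symbol probabilities, and its negative log2 is the ideal code length for the whole sequence, with no one-bit-per-symbol floor. This is a toy floating-point sketch under the assumed four-symbol model used earlier, not the renormalized integer MQ-coder that JPEG2000 actually specifies.

```python
import math

# Assumed stationary symbol model (same illustrative probabilities as above).
probs = {'a': 0.5, 'b': 0.25, 'c': 0.125, 'e': 0.125}
cum = {}
acc = 0.0
for sym, p in probs.items():   # cumulative probability of preceding symbols
    cum[sym] = acc
    acc += p

def encode_interval(sequence):
    """Narrow [low, low + width) by each symbol's probability."""
    low, width = 0.0, 1.0
    for sym in sequence:
        low += width * cum[sym]
        width *= probs[sym]
    return low, width

low, width = encode_interval('aabe')
print(-math.log2(width))  # ideal bits for the whole sequence: 7.0
```

Any number inside the final interval identifies the sequence, so fractional bits per symbol are possible; practical coders such as the JPEG2000 MQ-coder achieve the same effect with fixed-point arithmetic and renormalization.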
