Exam 4: Lossless Compression Algorithms
Consider an alphabet with three symbols A, B, C, with probabilities pA, pB, and pC = 1 - pA - pB.
State how you would go about plotting the entropy as a function of pA and pB. Pseudocode is perhaps the best way to state your answer.
(Essay)
Correct Answer:
In MATLAB, for example (the original listing is not preserved; the sketch below assumes a simple grid sweep and a surface plot):
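```matlab
% Sweep pA and pB over a grid; pC = 1 - pA - pB must stay positive.
step = 0.01;
[pA, pB] = meshgrid(step:step:1-step);
pC = 1 - pA - pB;
pC(pC <= 0) = NaN;                 % mask points outside the probability simplex
H = -(pA.*log2(pA) + pB.*log2(pB) + pC.*log2(pC));
surf(pA, pB, H);                   % entropy surface over (pA, pB)
xlabel('p_A'); ylabel('p_B'); zlabel('entropy (bits)');
```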
If the source string MULTIMEDIA is now encoded using Arithmetic coding, what is the codeword in fractional decimal representation? How many bits are needed for coding in binary format? How does this compare to the entropy?
(Essay)
Correct Answer:
We need ⌈log2(1/R)⌉ bits to code the sequence of symbols, where R is the range for the last coded symbol, i.e., the product of the probabilities of all coded symbols. Here, M and I each occur twice (probability 0.2 each) and the remaining six characters occur once (probability 0.1 each), so R = (0.2)^4 × (0.1)^6 = 1.6 × 10^-9. The result is then computed as log2(1/(1.6 × 10^-9)) = 29.21928095, i.e., 30 bits in binary format, and the average bits per symbol equals 29.21928095 / 10 = 2.921928095. We note that this is exactly equal to the entropy of the source, so in this case arithmetic coding provides optimal performance.
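A quick numerical cross-check (a sketch; the probability and count vectors just restate the tallies above):

```matlab
p   = [0.2 0.2 0.1 0.1 0.1 0.1 0.1 0.1];  % M, I, then U L T E D A
occ = [2 2 1 1 1 1 1 1];                  % occurrences in MULTIMEDIA
R = prod(p .^ occ);                        % -> 1.6e-09, the final range
nbits = -log2(R)                           % -> 29.2193 bits in total
H = -sum(p .* log2(p))                     % -> 2.9219 bits per symbol
```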
Construct a binary Huffman code for a source with three symbols A, B, and C, having probabilities 0.6, 0.3, and 0.1, respectively. What is its average codeword length, in bits per symbol? What is the entropy of this source?
(b) Let's now extend this code by grouping symbols into 2-character groups - a type of VQ. Compare the performance now, in bits per original source symbol, with the best possible.
[Fig. 7.1: Huffman tree for the 2-character-group code; figure not reproduced.]
(Essay)
Correct Answer:
(a) A: 1; B: 00; C: 01;
Average bits/symbol = 0.6 × 1 + 0.3 × 2 + 0.1 × 2 = 1.4 bits.
Entropy = -(0.6 log2 0.6 + 0.3 log2 0.3 + 0.1 log2 0.1) ≈ 1.2955 bits/symbol.
(b) The tree is per Fig. 7.1, built over the nine 2-character groups with probabilities AA = 0.36, AB = BA = 0.18, BB = 0.09, AC = CA = 0.06, BC = CB = 0.03, CC = 0.01.
Codeword bitlengths are: AA: 2, AB: 2, BA: 2, BB: 4, AC: 4, CA: 4, BC: 5, CB: 6, CC: 6 (ties in the Huffman construction allow other assignments with the same average).
Average = 2.67 bits per group = 1.335 bits per original source symbol.
The best possible is the entropy, given by: H ≈ 1.2955 bits per symbol.
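A quick numerical check of both parts (a sketch; the group bitlengths are the valid assignment listed above, and kron is used to form the pair probabilities):

```matlab
p = [0.6 0.3 0.1];                 % P(A), P(B), P(C)
avg1 = sum(p .* [1 2 2])           % -> 1.4 bits/symbol for code (a)
H = -sum(p .* log2(p))             % -> 1.2955 bits/symbol

pp = kron(p, p);                   % pair probabilities: AA AB AC BA BB BC CA CB CC
len = [2 2 4 2 4 5 4 6 6];         % bitlengths in the same order
avg2 = sum(pp .* len) / 2          % -> 1.335 bits per original symbol
```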
For the LZW algorithm, assume an alphabet {a, b, o, d, *} of size 5 from which text characters are picked.
Suppose the following string of characters is to be transmitted (or stored):
The coder starts with an initial dictionary consisting of the above 5 characters and an index assignment, as in the following table:
Make a table showing the string being assembled, the current character, the output, the new symbol, and the index corresponding to the symbol, for the above input - only go as far as "d a b b a d a b b a * d".

(Essay)
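No worked table survives in this copy, but the encoding loop itself is short. A minimal sketch (the function name and the 1-based index assignment are assumptions - the question's table may assign indices differently):

```matlab
function out = lzw_encode(s, alphabet)
    % initial dictionary: each single character gets its own index
    dict = containers.Map(num2cell(alphabet), num2cell(1:numel(alphabet)));
    out = [];
    w = '';                         % string currently being assembled
    for c = s
        wc = [w c];
        if isKey(dict, wc)
            w = wc;                 % keep extending the current string
        else
            out(end+1) = dict(w);   % emit the code for the assembled string
            dict(wc) = dict.Count + 1;  % new symbol enters the dictionary
            w = c;
        end
    end
    out(end+1) = dict(w);           % flush the final string
end
```

For the fragment above, lzw_encode('dabbadabba*d', 'abod*') emits one index per assembled string.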
Consider a text string containing a set of characters and their frequency counts as follows: A: (15), B: (7), C: (6), D: (6) and E: (5). Show that the Shannon-Fano algorithm produces a tree that encodes this string in a total of 89 bits, whereas the Huffman algorithm needs only 87 bits.
(Essay)
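The totals can be checked mechanically once the codeword lengths are read off the two trees; a sketch (lengths taken from one valid Shannon-Fano top split {A,B} | {C,D,E} and the corresponding Huffman tree):

```matlab
freq = [15 7 6 6 5];              % counts for A, B, C, D, E
sf_len = [2 2 2 3 3];             % Shannon-Fano codeword lengths
huff_len = [1 3 3 3 3];           % Huffman codeword lengths
sf_bits = sum(freq .* sf_len)     % -> 89 bits
huff_bits = sum(freq .* huff_len) % -> 87 bits
```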
Suppose we have a source consisting of two symbols, A and B, with probabilities p(A) and p(B) = 1 - p(A).
(a) Using Arithmetic Coding, what are the resulting bitstrings for the given input sequences?
(b) A major advantage of adaptive algorithms is that no a priori knowledge of symbol probabilities is required.
(1) Propose an Adaptive Arithmetic Coding algorithm and briefly describe how it would work. For simplicity, let's assume both encoder and decoder know how many symbols will be sent each time.
(2) Assume an initial sequence of messages is sent, followed by a second sequence.
Show how your Adaptive Arithmetic Coding algorithm works, for this sequence, for encoding: (i) the first symbol, (ii) the second symbol, and (iii) the last symbol.
(Essay)
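No answer is preserved for this question; the sketch below shows one way such an adaptive coder can work, shrinking the current interval and updating symbol counts after each symbol (the symbol names A/B and the count-based model are assumptions):

```matlab
function [low, high] = adaptive_arith_encode(msg)
    % msg: char row vector over {'A','B'}, e.g. 'ABBA'
    counts = [1 1];                  % start with equal counts for A and B
    low = 0; high = 1;
    for c = msg
        p = counts / sum(counts);    % current adaptive estimate of P(A), P(B)
        split = low + (high - low) * p(1);
        if c == 'A'
            high = split;  counts(1) = counts(1) + 1;
        else
            low = split;   counts(2) = counts(2) + 1;
        end
    end
    % any number in [low, high) identifies msg; the decoder mirrors the
    % same count updates, so no probabilities need be transmitted
end
```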
Calculate the entropy of a "checkerboard" image in which half of the pixels are BLACK and half of them are WHITE.
(Essay)
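No answer is preserved here; as a one-line sketch of the computation (two equally likely pixel values):

```matlab
p = [0.5 0.5];              % P(BLACK) = P(WHITE)
H = -sum(p .* log2(p))      % -> 1 bit per pixel
```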
Consider the question of whether it is true that the size of the Huffman code of a symbol s_i with probability p_i is always less than or equal to ⌈log2(1/p_i)⌉.
As a counterexample, let the source alphabet be {a, b, c}, with probabilities p(a), p(b), and p(c) = 1 - p(a) - p(b). Given p(a) and p(b), construct a Huffman tree. Will the tree be different for different values of p(a) and p(b)? What conclusion can you draw?
(Essay)
Using the Lempel-Ziv-Welch (LZW) algorithm, encode the following source symbol sequence: MISSISSIPPI#
Show the initial codes as ASCII values, e.g., the code for "M". Start new codes with the code value 256.
(b) Assuming the source characters (ASCII) and the dictionary entry codes are combined and represented by a 9-bit code, calculate the total number of bits needed for encoding the source string up to the # character, i.e., just MISSISSIPPI. What is the compression ratio, compared to simply using 8-bit ASCII codes for the input characters?
(Essay)
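Using the lzw_encode sketch from the earlier LZW question (the number of emitted codes does not depend on the particular index assignment, so a small dictionary stands in for full ASCII), a hedged ratio check:

```matlab
codes = lzw_encode('MISSISSIPPI', 'MISP');  % the 4 distinct source characters
nbits = numel(codes) * 9                     % 9 bits per emitted code
ratio = (11 * 8) / nbits                     % vs. plain 8-bit ASCII input
```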
Is the following code uniquely decodable?
1 → a    2 → aba    3 → bab
(Essay)
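No answer is retained for this question; a hedged sketch of the standard Sardinas-Patterson test, which decides unique decodability for any finite code (function names are illustrative, not from the text):

```matlab
function ud = is_uniquely_decodable(C)
    % C: cell array of codewords, e.g. is_uniquely_decodable({'a','aba','bab'})
    C = C(:);
    seen = cell(0, 1);
    S = dangling(C, C);                    % initial dangling suffixes
    while ~isempty(S)
        if any(ismember(S, C))             % a dangling suffix is a codeword:
            ud = false; return;            % not uniquely decodable
        end
        seen = [seen; S];
        S = [dangling(C, S); dangling(S, C)];
        S = setdiff(S, seen);              % keep only unexamined suffixes
    end
    ud = true;                             % no conflict was ever produced
end

function suf = dangling(A, B)
    % suffixes left over when a word of A is a proper prefix of a word of B
    suf = cell(0, 1);
    for i = 1:numel(A)
        for j = 1:numel(B)
            a = A{i}; b = B{j};
            if numel(a) < numel(b) && strcmp(b(1:numel(a)), a)
                suf{end+1, 1} = b(numel(a)+1:end);
            end
        end
    end
    suf = unique(suf);
end
```

For the code above, the suffix chain ba → b → ab → a ends at the codeword "a", so the test reports that the code is not uniquely decodable.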
Suppose we wish to transmit the 10-character string "MULTIMEDIA". The characters in the string are chosen from a finite alphabet of 8 characters.
(a) What are the probabilities of each character in the source string?
(b) Compute the entropy of the source string.
(c) If the source string is encoded using Huffman coding, draw the encoding tree and compute the average number of bits needed.
(Essay)
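For parts (a) and (b), a quick sketch (drawing the Huffman tree for part (c) is left to the reader; accumarray tallies the character counts):

```matlab
s = 'MULTIMEDIA';
[chars, ~, idx] = unique(s);        % the 8 distinct characters
p = accumarray(idx, 1) / numel(s);  % per-character probabilities
H = -sum(p .* log2(p))              % -> 2.9219 bits/symbol
```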