ITC Notes 2
McMillan inequality, Source coding theorem, Shannon-Fano coding, Huffman coding, Extended
Huffman coding - Joint and conditional entropies, Mutual information - Discrete memoryless
channels – BSC, BEC – Channel capacity, Shannon limit
***************************************************************************************************
When we observe the possibilities of the occurrence of an event, and consider how surprising or
uncertain it would be, we are trying to form an idea of the average information content coming
from the source of the event.
Entropy can be defined as a measure of the average information content per source symbol:

H = -Σ_i p_i log_b(p_i)

where p_i is the probability of occurrence of character number i in a given stream of characters
and b is the base of the logarithm used. This quantity is also called Shannon's entropy.
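As an illustrative sketch (not part of the original notes), the entropy formula can be evaluated
numerically; the symbol probabilities below are made-up values chosen for demonstration.

import math

def entropy(probs, base=2):
    # H = -sum(p_i * log_b(p_i)); terms with p_i = 0 contribute nothing
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# Hypothetical four-symbol source (probabilities are illustrative only)
print(entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75 bits per symbol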
Conditional Entropy: The amount of uncertainty remaining about the channel input after
observing the channel output is called conditional entropy. It is denoted by H(x|y) and is given by

H(x|y) = -Σ_i Σ_k p(x_i, y_k) log2 p(x_i | y_k)
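A minimal numerical sketch of this definition, assuming a made-up 2x2 joint distribution
p(x, y) (the values are illustrative, not from the notes):

import math

# Hypothetical joint distribution p(x, y) over a 2x2 alphabet (illustrative values)
joint = [[0.4, 0.1],
         [0.1, 0.4]]

p_y = [sum(joint[x][y] for x in range(2)) for y in range(2)]  # marginal p(y)

# H(x|y) = -sum over x, y of p(x, y) * log2( p(x, y) / p(y) )
h_x_given_y = -sum(joint[x][y] * math.log2(joint[x][y] / p_y[y])
                   for x in range(2) for y in range(2) if joint[x][y] > 0)
print(h_x_given_y)  # about 0.722 bits of uncertainty about x remain after observing y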
Mutual Information:
Consider a channel with input X and output Y. The entropy H(x) represents the uncertainty
about the channel input; this is assumed before the input is applied.
To know the uncertainty about the input after the output is observed, let us consider the
conditional entropy, given that Y = y_k:

H(x | y_k) = -Σ_i p(x_i | y_k) log2 p(x_i | y_k)
Averaging over all received symbols y_k gives H(x|y). From this we come to know that the
difference, i.e. H(x) − H(x|y), must represent the uncertainty about the channel input that is
resolved by observing the channel output.
Denoting the mutual information as I(x;y), we can write the whole thing in an equation, as follows:

I(x;y) = H(x) − H(x|y)
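Continuing the sketch with the same made-up joint distribution, I(x;y) = H(x) − H(x|y) can be
checked numerically:

import math

joint = [[0.4, 0.1],
         [0.1, 0.4]]  # hypothetical joint p(x, y), same illustrative values as above

p_x = [sum(row) for row in joint]                             # marginal p(x)
p_y = [sum(joint[x][y] for x in range(2)) for y in range(2)]  # marginal p(y)

h_x = -sum(p * math.log2(p) for p in p_x if p > 0)            # prior uncertainty H(x)
h_x_given_y = -sum(joint[x][y] * math.log2(joint[x][y] / p_y[y])
                   for x in range(2) for y in range(2) if joint[x][y] > 0)

# I(x;y) = H(x) - H(x|y): uncertainty about x resolved by observing y
print(h_x - h_x_given_y)  # about 0.278 bits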
10. Mutual information I(x;y) of a channel is always
a) Negative
b) Positive
c) Positive & Negative
d) None of the mentioned    Ans: b
11. Types of compression
a. Lossless
b. Lossy
c. Both a and b
d. None of the above    Ans: C
12. What is the significance of D-frames in video coding?
a. They generate a low resolution picture.
b. Highly compressed technique.
c. They generate a high resolution picture.
d. None of the above    Ans: A
13. MPEG coders are used for
a. compression of audio
b. compression of text
c. compression of audio and video
d. None of the above    Ans: C
14. B-frame is also known as
a. unidirectional.
b. B-decompression technique.
c. B-compression technique.
d. bidirectional frame.    Ans: D
15. I-frame is
a. It basically searches the frames.
b. It basically predicts the movement of objects.
c. It basically compresses the movement of objects.
d. None of the above.    Ans: D
16. H.261 is
a. compression of audio
b. de-compression of audio
c. a video compression standard
d. None of the above    Ans: C
Assignment
(1) Explain the terms (i) Self information (ii) Average information (iii) Mutual information.
(2) Discuss the reason for using a logarithmic measure for measuring the amount of information.
(3) Explain the concept of the amount of information associated with a message.
(4) A binary source emits an independent sequence of 0's and 1's with probabilities p and (1 − p)
respectively. Plot the entropy of the source.
(5) Explain the concepts of information, average information, information rate and redundancy as
referred to information transmission.
(6) Let X represent the outcome of a single roll of a fair die. What is the entropy of X?
(7) A code is composed of dots and dashes. Assume that a dash is 3 times as long as a dot and
has one-third the probability of occurrence. (i) Calculate the information in a dot and in a
dash; (ii) calculate the average information in the dot-dash code; and (iii) assuming that a dot
lasts for 10 ms and the same time interval is allowed between symbols, calculate the average
rate of information transmission.
(8) What do you understand by the term extension of a discrete memoryless source? Show that
the entropy of the nth extension of a DMS is n times the entropy of the original source.
(9) A card is drawn from a deck of playing cards. A) You are informed that the card you drew is
a spade. How much information did you receive, in bits? B) How much information did you
receive if you are told that the card you drew is an ace? C) How much information did you
receive if you are told that the card you drew is the ace of spades? Is the information content of
the message "ace of spades" the sum of the information contents of the messages "spade" and
"ace"?
(10) The output of an information source consists of 128 symbols, 16 of which occur with
probability 1/32 and the remaining 112 with probability 1/224. The source emits
1000 symbols/sec. Assuming that the symbols are chosen independently, find the information
rate of the source.