SRM Institute of Science & Technology
CLO-4 Apply convolutional codes for performance analysis and cyclic codes for error detection and correction.
CLO-5 Evaluate channel performance using information theory.
CLO-6 Analyse any type of channel and select coding techniques to improve channel performance.
10 6 2
a) p(y1) = 0.48, p(y2) = 0.12
b) p(y1) = 0.48, p(y2) = 0.12
c) p(y1) = 0.6, p(y2) = 0.4
d) p(y1) = 0.48, p(y2) = 0.28
Part B
Answer any 4 of the following (4 x 4 = 16 Marks)
Q.No CLOs Bloom Level Question
11 4 2 What is a turbo code? Outline its block diagram.
12 4 2 What is meant by constraint length of a convolutional encoder?
13 4 2 Explain the performance characteristics of sequential decoding.
14 5 2 A quaternary source generates information with probabilities 0.1, 0.2, 0.3 and 0.4. Find the entropy and the maximum entropy of the source.
15 5 2 Find the channel matrix for the channel shown below.
16 6 2 What are the conditions of Shannon's theorem? Why is it important in information theory?
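As a quick check on Q.14, the entropy of the quaternary source follows directly from H = -Σ p·log2(p); a minimal Python sketch (the function name `entropy` is illustrative, not from the paper):

```python
import math

def entropy(probs):
    """Shannon entropy in bits per symbol: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs)

probs = [0.1, 0.2, 0.3, 0.4]   # quaternary source from Q.14
H = entropy(probs)             # about 1.846 bits/symbol
H_max = math.log2(len(probs))  # maximum entropy: 4 equiprobable symbols -> 2 bits
```

The maximum entropy is attained only when all four symbols are equiprobable, so H < H_max here.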
Part C
2 x 12 = 24 Marks
Q.No CLOs Bloom Level Question
17.a 4 2 Obtain the state table and state diagram, construct the trellis structure, and find the encoded bits for the given message bits = 1 0 0 1 1.
(OR)
17.b 4 2 Describe the implementation of Viterbi Algorithm with an example.
18.a 5 2 Find the mutual information for the channel shown below.
(OR)
18.b 6 2 Apply the Shannon-Fano coding procedure for the following message ensemble:
[X] x1 x2 x3 x4 x5 x6 x7 x8
[P]
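The probability row [P] of Q.18(b) is not given above; purely as an illustration, the Shannon-Fano procedure can be sketched with an assumed dyadic distribution (the probabilities and the helper name `shannon_fano` are assumptions, not part of the paper):

```python
def shannon_fano(symbols):
    """symbols: list of (name, prob) sorted by descending probability.
    Split where the two halves' total probabilities are closest,
    prefix '0' to the top half and '1' to the bottom half, then recurse."""
    if len(symbols) == 1:
        return {symbols[0][0]: ""}
    total = sum(p for _, p in symbols)
    best_diff, split, running = float("inf"), 1, 0.0
    for i in range(1, len(symbols)):
        running += symbols[i - 1][1]
        diff = abs(2 * running - total)  # imbalance if we split before index i
        if diff < best_diff:
            best_diff, split = diff, i
    codes = {s: "0" + c for s, c in shannon_fano(symbols[:split]).items()}
    codes.update({s: "1" + c for s, c in shannon_fano(symbols[split:]).items()})
    return codes

ensemble = list(zip(
    ["x1", "x2", "x3", "x4", "x5", "x6", "x7", "x8"],
    [0.25, 0.25, 0.125, 0.125, 0.0625, 0.0625, 0.0625, 0.0625],  # assumed [P]
))
codes = shannon_fano(ensemble)
```

With a dyadic distribution like the assumed one, every codeword length equals -log2(p), so the average code length meets the source entropy exactly.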
Prepared by
Ms. Srisabarimani K
Assistant Professor [S.G] – ECE Department