
SRM Institute of Science & Technology

College of Engineering & Technology


Ramapuram Campus
Department of Electronics and Communication Engineering
Academic Year (2021-2022)
Continuous Learning Assessment – III | IV ECE – A, B, C & D | Date:
18ECE225T – Information Theory and Coding | Semester: VII
Max Marks: 50 | 2018 Regulations | Duration: 1 ½ hrs

CLO-4: Apply convolutional codes for performance analysis and cyclic codes for error detection and correction.
CLO-5: Design the channel performance using information theory.
CLO-6: Analyse any type of channel and select coding techniques to improve channel performance.

Part A (Answer all the questions)  10 x 1 = 10 M


Q No | CLO | Blooms Level | Question
1 | 4 | 2 | When the message bits arrive serially, making the use of a buffer undesirable, which of the following codes is preferred to handle them?
a) Block Codes
b) Cyclic Codes
c) Convolutional Codes
d) Repetition Codes
2 | 4 | 2 | An encoder with constraint length of K = 3 has ________ memory elements.
a) 2
b) 4
c) 7
d) 8
3 | 4 | 2 | Each branch of the tree corresponding to input 1 specifies a __________ branch, and each branch of the tree corresponding to input 0 specifies a __________ branch.
a) upward, downward
b) downward, upward
c) upward, upward
d) downward, downward
4 | 4 | 2 | The central portion of the trellis exhibits a ________ periodic structure.
a) Large
b) Repetitive
c) Fixed
d) Interchangeable
5 | 4 | 2 | The decoder that produces the most likely codeword as its output when all input messages are equally likely is known as ____________.
a) Maximum likelihood decoder
b) Sequential decoder
c) Syndrome decoder
d) Minimum distance decoder
6 | 5 | 2 | The maximum entropy of a source that produces 16 symbols with equal probability is _______ bits/symbol.
a) 16
b) 4
c) 1.2
d) 8
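For reference, the maximum entropy of a source with M equiprobable symbols follows directly from the definition:

H_{\max} = \log_2 M = \log_2 16 = 4 \ \text{bits/symbol}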
7 | 5 | 2 | A source emits 3 characters with probabilities of 0.25, 0.35 and 0.4 respectively. Calculate the source entropy.
a) 2.93 bits/symbol
b) 2.62 bits/symbol
c) 2.45 bits/symbol
d) 1.55 bits/symbol
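As a worked check, applying H(X) = -\sum_i p_i \log_2 p_i to the given probabilities gives

H(X) = -(0.25\log_2 0.25 + 0.35\log_2 0.35 + 0.4\log_2 0.4) \approx 0.500 + 0.530 + 0.529 \approx 1.56 \ \text{bits/symbol}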
8 | 5 | 2 | Shannon-Fano method is not suitable when __________ is increased.
a) Entropy
b) Number of messages
c) Efficiency
d) Redundancy
9 | 6 | 2 | Given a BSC with capacity 0.25 and transinformation of 0.21, obtain the redundancy of the channel.
a) 0.84
b) 0.16
c) 1.84
d) 1.68
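For reference, channel redundancy can be taken here as the complement of the channel efficiency (transinformation divided by capacity):

R = 1 - \frac{I(X;Y)}{C} = 1 - \frac{0.21}{0.25} = 0.16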
10 | 6 | 2 | Determine the receiver probability values p(y1) and p(y2) for the channel given below, if p(x1) = 0.6.
a) p(y1) = 0.48, p(y2) = 0.12
b) p(y1) = 0.48, p(y2) = 0.12
c) p(y1) = 0.6, p(y2) = 0.4
d) p(y1) = 0.48, p(y2) = 0.28
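The channel transition diagram referred to above is not reproduced in this copy, so only the general relation can be stated: the receiver probabilities follow from the total-probability rule over the channel matrix,

p(y_j) = \sum_i p(x_i)\, p(y_j \mid x_i), \qquad p(x_1) = 0.6,\ p(x_2) = 0.4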
Part B (Answer any 4 of the following)  4 x 4 = 16 M

Q No | CLO | Blooms Level | Question
11 | 4 | 2 | What is a turbo code? Outline its block diagram.
12 | 4 | 2 | What is meant by constraint length of a convolutional encoder?
13 | 4 | 2 | Explain the performance characteristics of sequential decoding.
14 | 5 | 2 | A quaternary source generates information with probabilities 0.1, 0.2, 0.3 and 0.4. Find the entropy and the maximum entropy of the source.
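As a worked check, with H(X) = -\sum_i p_i \log_2 p_i:

H(X) = -(0.1\log_2 0.1 + 0.2\log_2 0.2 + 0.3\log_2 0.3 + 0.4\log_2 0.4) \approx 1.85 \ \text{bits/symbol}, \qquad H_{\max} = \log_2 4 = 2 \ \text{bits/symbol}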
15 | 5 | 2 | Find the channel matrix for the channel shown below.
16 | 6 | 2 | What are the conditions of Shannon's theorem? Why is it important in information theory?
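Assuming the question refers to Shannon's channel capacity theorem for a band-limited AWGN channel (the Shannon-Hartley form), the key relation is

C = B \log_2\!\left(1 + \frac{S}{N}\right) \ \text{bits/s}

and reliable (arbitrarily low error) transmission is possible only while the information rate R satisfies R \le C.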

Part C  2 x 12 = 24 M

Q No | CLO | Blooms Level | Question
17.a | 4 | 2 | Obtain the state table and state diagram, construct the trellis structure, and find the encoded bits for the given message bits = 1 0 0 1 1.
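The encoder diagram for this question is not reproduced in this copy. As an illustration only, the following is a minimal sketch of a rate-1/2, K = 3 convolutional encoder in Python, assuming the common generator polynomials g1 = 111 and g2 = 101 (octal 7, 5); the encoder on the actual question paper may differ.

# Minimal sketch of a rate-1/2 convolutional encoder with constraint length K = 3.
# Generators g1 = 111, g2 = 101 (octal 7, 5) are an assumption, since the
# encoder diagram is not reproduced above.
def conv_encode(bits, g1=(1, 1, 1), g2=(1, 0, 1)):
    state = [0, 0]                       # two memory elements for K = 3
    out = []
    for b in bits:
        window = [b] + state             # current input followed by the register contents
        v1 = sum(x * g for x, g in zip(window, g1)) % 2
        v2 = sum(x * g for x, g in zip(window, g2)) % 2
        out.extend([v1, v2])
        state = [b, state[0]]            # shift the register
    return out

print(conv_encode([1, 0, 0, 1, 1]))      # encoded sequence for the given message bits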

(OR)
17.b | 4 | 2 | Describe the implementation of the Viterbi algorithm with an example.
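As a companion to the encoder sketch above, here is a minimal hard-decision Viterbi decoder for the same assumed rate-1/2, K = 3 code (generators 7, 5 octal). It is an illustrative sketch under those assumptions, not a prescribed solution; a full answer would also trace the trellis by hand.

def viterbi_decode(received, g1=(1, 1, 1), g2=(1, 0, 1)):
    # States are the two memory bits (s0, s1); 4 states for K = 3.
    def outputs(bit, state):
        window = (bit,) + state
        v1 = sum(x * g for x, g in zip(window, g1)) % 2
        v2 = sum(x * g for x, g in zip(window, g2)) % 2
        return (v1, v2)

    def next_state(bit, state):
        return (bit, state[0])

    n = len(received) // 2
    states = [(a, b) for a in (0, 1) for b in (0, 1)]
    # Path metric and survivor path per state; start in the all-zero state.
    metric = {s: (0 if s == (0, 0) else float("inf")) for s in states}
    path = {s: [] for s in states}

    for t in range(n):
        r = received[2 * t: 2 * t + 2]
        new_metric = {s: float("inf") for s in states}
        new_path = {}
        for s in states:
            if metric[s] == float("inf"):
                continue
            for bit in (0, 1):
                ns = next_state(bit, s)
                out = outputs(bit, s)
                # Hamming distance between expected and received output pair
                m = metric[s] + sum(a != b for a, b in zip(out, r))
                if m < new_metric[ns]:
                    new_metric[ns] = m
                    new_path[ns] = path[s] + [bit]
        metric, path = new_metric, new_path

    best = min(metric, key=metric.get)   # survivor with the smallest metric
    return path[best]

# Decoding the sequence produced by the encoder sketch above:
print(viterbi_decode([1, 1, 1, 0, 1, 1, 1, 1, 0, 1]))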
18.a | 5 | 2 | Find the mutual information for the channel shown below.
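The channel diagram for this question is not reproduced in this copy; for reference, the mutual information is evaluated from

I(X;Y) = H(Y) - H(Y \mid X) = \sum_{i,j} p(x_i)\, p(y_j \mid x_i) \log_2 \frac{p(y_j \mid x_i)}{p(y_j)} \ \text{bits/symbol}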

(OR)
18.b | 6 | 2 | Apply the Shannon-Fano coding procedure for the following message ensemble:
[X]: x1 x2 x3 x4 x5 x6 x7 x8
[P]:
Calculate the entropy and efficiency of the code.
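The probability row [P] is not readable in this copy, so the following sketch only illustrates the Shannon-Fano procedure with a hypothetical distribution over x1..x8; the entropy and efficiency figures it prints are for that assumed distribution, not for the exam's ensemble.

import math

def shannon_fano(symbols):
    # symbols: list of (name, probability), assumed sorted in descending probability
    if len(symbols) == 1:
        return {symbols[0][0]: ""}
    total = sum(p for _, p in symbols)
    # Choose the split point that makes the two groups' probabilities as equal as possible.
    running, best_i, best_diff = 0.0, 1, float("inf")
    for i in range(1, len(symbols)):
        running += symbols[i - 1][1]
        diff = abs(total - 2 * running)
        if diff < best_diff:
            best_diff, best_i = diff, i
    codes = {}
    for prefix, group in (("0", symbols[:best_i]), ("1", symbols[best_i:])):
        for name, code in shannon_fano(group).items():
            codes[name] = prefix + code
    return codes

# Hypothetical probabilities, used only to illustrate the procedure.
probs = [("x1", 0.25), ("x2", 0.20), ("x3", 0.15), ("x4", 0.12),
         ("x5", 0.10), ("x6", 0.08), ("x7", 0.06), ("x8", 0.04)]
codes = shannon_fano(probs)
H = -sum(p * math.log2(p) for _, p in probs)          # entropy, bits/symbol
L = sum(p * len(codes[s]) for s, p in probs)          # average code length
print(codes)
print(f"H = {H:.3f} bits/symbol, L = {L:.3f}, efficiency = {H / L:.3f}")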

Prepared by
Ms. Srisabarimani K
Assistant Professor [S.G] – ECE Department
