Communication Theory–Information Theory–Two marks with answers

Anna University

147402: Communication Theory

SEM / YEAR: IV/ II

QUESTION BANK


UNIT-V

INFORMATION THEORY

PART A


1. Define information rate.

Ans:

If a source X emits symbols at a rate of r symbols per second, the information rate R of the source is given by

R = r H(X) bits/second, where H(X) is the entropy of the source.
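
As a quick sketch of this relation (the symbol rate and source probabilities below are assumed purely for illustration):

    # Minimal sketch: information rate R = r * H(X).
    import math

    probs = [0.5, 0.25, 0.25]                  # assumed source symbol probabilities
    H = -sum(p * math.log2(p) for p in probs)  # entropy H(X) = 1.5 bits/symbol
    r = 1000                                   # assumed symbol rate in symbols/second
    R = r * H                                  # information rate = 1500 bits/second
    print(R)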


2. Define entropy.

Ans:

Entropy is a measure of the average information content per symbol emitted by the source. It is given by H(X) = -∑ P(xi) log2 P(xi) bits/symbol, where the sum runs over all source symbols xi.
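
A minimal Python sketch of this definition (the probability list is an assumed example):

    # Minimal sketch of H(X) = -sum(P(xi) * log2(P(xi))).
    import math

    def entropy(probs):
        """Entropy of a discrete memoryless source, in bits/symbol."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    print(entropy([0.5, 0.25, 0.125, 0.125]))   # 1.75 bits/symbol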


3. What is a prefix code?

Ans:

In a prefix code, no codeword is a prefix of any other codeword. It is a variable-length code in which binary digits are assigned to the messages according to their probabilities of occurrence.
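
A short sketch that checks the prefix condition, using the assumed example code {0, 10, 110, 111}:

    # Minimal sketch: verify that no codeword is a prefix of any other codeword.
    def is_prefix_free(codewords):
        for a in codewords:
            for b in codewords:
                if a != b and b.startswith(a):
                    return False
        return True

    print(is_prefix_free(["0", "10", "110", "111"]))   # True: a valid prefix code
    print(is_prefix_free(["0", "01", "11"]))           # False: "0" is a prefix of "01"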


4. What is the channel capacity of a binary symmetric channel with error probability 0.2?
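
Ans:

For a binary symmetric channel with crossover probability p, the capacity is C = 1 + p log2 p + (1 - p) log2(1 - p) bits/symbol. For p = 0.2 this gives C = 1 - 0.722 = 0.278 bits/symbol (approximately).

A quick arithmetic check, sketched in Python:

    # Capacity of a binary symmetric channel: C = 1 - H(p).
    import math

    p = 0.2
    Hb = -p * math.log2(p) - (1 - p) * math.log2(1 - p)   # binary entropy of 0.2, ~0.722
    C = 1 - Hb                                            # ~0.278 bits/symbol
    print(C)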


5. Define mutual information.

Ans:

Mutual information I(X,Y) of a channel is defined by I(X,Y) = H(X) - H(X/Y) bits/symbol

where H(X) is the entropy of the source and H(X/Y) is the conditional entropy of X given Y.
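
As a hedged illustration, these quantities can be computed from a joint distribution P(x, y); the 2x2 table below is an assumed example:

    # Minimal sketch: I(X,Y) = H(X) - H(X/Y) from an assumed joint distribution.
    import math

    joint = [[0.4, 0.1],        # assumed joint probabilities P(x, y)
             [0.1, 0.4]]        # rows index x, columns index y

    px = [sum(row) for row in joint]            # marginal P(x) = [0.5, 0.5]
    py = [sum(col) for col in zip(*joint)]      # marginal P(y) = [0.5, 0.5]

    HX = -sum(p * math.log2(p) for p in px)     # H(X) = 1 bit
    HX_given_Y = -sum(pxy * math.log2(pxy / py[j])
                      for row in joint
                      for j, pxy in enumerate(row))   # H(X/Y) ~ 0.722 bits
    print(HX - HX_given_Y)                            # I(X,Y) ~ 0.278 bits/symbol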


6. State the Shannon-Hartley theorem.

Ans:

The capacity C of a band-limited additive white Gaussian noise channel is

C = B log2(1 + S/N)

where B is the channel bandwidth and S/N is the signal-to-noise ratio.
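
A numerical sketch of this theorem, using assumed telephone-channel figures (B = 3000 Hz, S/N = 30 dB):

    # Minimal sketch of C = B * log2(1 + S/N); the bandwidth and SNR are assumed values.
    import math

    B = 3000                      # assumed bandwidth in Hz
    snr_db = 30                   # assumed signal-to-noise ratio in dB
    snr = 10 ** (snr_db / 10)     # linear S/N = 1000
    C = B * math.log2(1 + snr)    # ~29,902 bits/second
    print(C)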


7. State any four properties of entropy.

Ans:

1. H(X) >= 0.

2. H(X) = 0 only when one of the symbols has probability 1 (no uncertainty).

3. H(X) <= log2 M, where M is the number of source symbols.

4. H(X) = log2 M when all the symbols are equally likely.


8. Give the expressions for channel capacity of a Gaussian channel.

Ans:

The channel capacity of a Gaussian channel is given as C = B log2(1 + S/N) bits/second, where B is the bandwidth in Hz and S/N is the signal-to-noise ratio.


9. Define the entropy of a discrete memoryless source.

Ans:

The entropy of a discrete memoryless source is H(X) = -∑ P(xi) log2 P(xi) bits/symbol. For a binary memoryless source this reduces to H(X) = -p0 log2 p0 - (1 - p0) log2(1 - p0), where p0 is the probability of symbol '0' and p1 = 1 - p0 is the probability of symbol '1'.
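
A small sketch evaluating the binary entropy expression at a couple of assumed values of p0:

    # Minimal sketch of H(p0) = -p0*log2(p0) - (1 - p0)*log2(1 - p0).
    import math

    def binary_entropy(p0):
        if p0 in (0.0, 1.0):
            return 0.0            # no uncertainty when one symbol is certain
        return -p0 * math.log2(p0) - (1 - p0) * math.log2(1 - p0)

    print(binary_entropy(0.5))    # 1.0 bit/symbol (the maximum)
    print(binary_entropy(0.1))    # ~0.469 bits/symbol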


10. Define the significance of the entropy H(X/Y) of a communication system where X is the transmitter and Y is the receiver.

Ans:

H(X/Y) is called the conditional entropy. It represents the uncertainty of X, on average, when Y is known. In other words, H(X/Y) is an average measure of the uncertainty remaining about X after Y is received.

H(X/Y) represents the information lost in the noisy channel.


11. An event has six possible outcomes with probabilities 1/2, 1/4, 1/8, 1/16, 1/32, 1/32. Find the entropy of the system.

Ans:

H = ∑ Pk log2(1/Pk)

= (1/2) log2 2 + (1/4) log2 4 + (1/8) log2 8 + (1/16) log2 16 + (1/32) log2 32 + (1/32) log2 32

= 0.5 + 0.5 + 0.375 + 0.25 + 0.15625 + 0.15625

= 1.9375 bits.
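
A one-line check of this sum in Python (a verification sketch):

    # Verify H for the distribution {1/2, 1/4, 1/8, 1/16, 1/32, 1/32}.
    import math

    probs = [1/2, 1/4, 1/8, 1/16, 1/32, 1/32]
    print(sum(p * math.log2(1 / p) for p in probs))   # 1.9375 bits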


12. When is the average information delivered by a source of alphabet size 2, maximum?

Ans:

Average information is maximum when the two messages are equally likely, i.e., p1 = p2 = 1/2. The maximum average information is then given as

Hmax = (1/2) log2 2 + (1/2) log2 2 = 1 bit/message.


13. Name the source coding techniques.

Ans:

1. Prefix coding

2. Shannon-Fano coding

3. Huffman coding (a minimal sketch follows below).
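
As an illustrative sketch of Huffman coding, a minimal coder for an assumed symbol set (the probabilities are chosen only for the example):

    # Minimal Huffman coding sketch; symbols and probabilities are assumed for illustration.
    import heapq

    def huffman_code(prob_map):
        """Return a dict mapping each symbol to its binary Huffman codeword."""
        # Heap entries are (probability, tie-breaker, subtree); a subtree is either
        # a symbol (leaf) or a [left, right] pair of subtrees.
        heap = [(p, i, sym) for i, (sym, p) in enumerate(prob_map.items())]
        heapq.heapify(heap)
        count = len(heap)
        while len(heap) > 1:
            p1, _, left = heapq.heappop(heap)     # merge the two least probable subtrees
            p2, _, right = heapq.heappop(heap)
            heapq.heappush(heap, (p1 + p2, count, [left, right]))
            count += 1
        codes = {}
        def walk(node, prefix):
            if isinstance(node, list):
                walk(node[0], prefix + "0")
                walk(node[1], prefix + "1")
            else:
                codes[node] = prefix or "0"       # a single-symbol source gets "0"
            return codes
        return walk(heap[0][2], "")

    print(huffman_code({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}))
    # e.g. {'a': '0', 'b': '10', 'c': '110', 'd': '111'} (0/1 labelling may vary)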


14. Write down the formula for mutual information.

Ans:

The mutual information is defined as the amount of information transferred when Xi is transmitted and Yj is received. It is represented by I(Xi, Yj) and is given as

I(Xi, Yj) = log2( P(Xi/Yj) / P(Xi) ) bits.
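
A tiny numerical sketch of this formula with assumed probabilities:

    # Minimal sketch of I(Xi, Yj) = log2( P(Xi/Yj) / P(Xi) ).
    import math

    p_xi = 0.25            # assumed prior probability of Xi
    p_xi_given_yj = 0.5    # assumed probability of Xi after Yj is received
    print(math.log2(p_xi_given_yj / p_xi))   # 1.0 bit of information transferred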


15. Write the expression for code efficiency in terms of entropy.

Ans:

Code efficiency = H(X) / L, where H(X) is the source entropy and L is the average codeword length in bits per symbol. Redundancy = 1 - code efficiency; redundancy should be as low as possible.
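
A short sketch of these relations for an assumed source and an assumed binary prefix code:

    # Minimal sketch: code efficiency = H(X) / L and redundancy = 1 - efficiency.
    # The probabilities and codeword lengths are assumed (they match the code {0, 10, 110, 111}).
    import math

    probs   = [0.5, 0.25, 0.125, 0.125]
    lengths = [1, 2, 3, 3]

    H = -sum(p * math.log2(p) for p in probs)        # source entropy = 1.75 bits/symbol
    L = sum(p * n for p, n in zip(probs, lengths))   # average codeword length = 1.75 bits/symbol
    efficiency = H / L                               # 1.0 for this ideal example
    redundancy = 1 - efficiency                      # 0.0
    print(efficiency, redundancy)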


16. Is the information of a continuous system non negative? If so, why?

Ans:

Yes, the information (mutual information) of a continuous system is non-negative, because I(X;Y) >= 0 is one of its properties.

