This set of Digital Communications Multiple Choice Questions & Answers (MCQs) focuses on “Information and Coding”.

1. Self-information should be

a) Positive

b) Negative

c) Positive & Negative

d) None of the mentioned

Answer: a

Explanation: Self-information, I(x) = -log2 p(x), is always non-negative, since 0 ≤ p(x) ≤ 1.
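
This non-negativity can be checked directly from the definition; a minimal Python sketch (the function name is illustrative):

```python
import math

def self_information(p):
    """Self-information I(x) = -log2 p(x), measured in bits."""
    return -math.log2(p)

# Probabilities lie in (0, 1], so log2 p <= 0 and I(x) >= 0.
print(self_information(0.5))   # 1.0 bit for a fifty-fifty event
print(self_information(1.0))   # a certain event carries no information
```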

2. The unit of average mutual information is

a) Bits

b) Bytes

c) Bits per symbol

d) Bytes per symbol

Answer: a

Explanation: The unit of average mutual information is bits (when the logarithm is taken to base 2).

3. When probability of error during transmission is 0.5, it indicates that

a) Channel is very noisy

b) No information is received

c) Both of the mentioned

d) None of the mentioned

Answer: c

Explanation: When the probability of error is 0.5, the received symbol is statistically independent of the transmitted one: the channel is very noisy and no information is received.

4. Binary Huffman coding is a

a) Prefix condition code

b) Suffix condition code

c) Prefix & Suffix condition code

d) None of the mentioned

Answer: a

Explanation: Binary Huffman coding is a prefix condition code: no codeword is a prefix of any other, so the code can be decoded without separators.

5. The event with minimum probability has the least number of bits.

a) True

b) False

Answer: b

Explanation: False. In binary Huffman coding the event with maximum probability gets the fewest bits; less probable events get longer codewords.
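
The effect can be seen by building the code; below is a minimal Huffman sketch (the helper name and tie-breaking are illustrative, so the exact codewords may vary, but the codeword lengths are characteristic):

```python
import heapq
from itertools import count

def huffman_code(probs):
    """Build a binary Huffman code; returns {symbol: codeword}."""
    ticket = count()  # unique tie-breaker so the heap never compares dicts
    heap = [(p, next(ticket), {s: ""}) for s, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p0, _, c0 = heapq.heappop(heap)  # two least probable subtrees
        p1, _, c1 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c0.items()}
        merged.update({s: "1" + w for s, w in c1.items()})
        heapq.heappush(heap, (p0 + p1, next(ticket), merged))
    return heap[0][2]

code = huffman_code({"a": 0.5, "b": 0.25, "c": 0.15, "d": 0.10})
print(code)  # the most probable symbol "a" gets the shortest codeword
```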

6. The method of converting a word to stream of bits is called as

a) Binary coding

b) Source coding

c) Bit coding

d) Cipher coding

Answer: b

Explanation: Source coding is the method of converting a word into a stream of bits, i.e. 0s and 1s.

7. When the base of the logarithm is 2, then the unit of measure of information is

a) Bits

b) Bytes

c) Nats

d) None of the mentioned

Answer: a

Explanation: When the base of the logarithm is 2, the unit of measure of information is the bit; with base e it is the nat.
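
Changing the base of the logarithm only rescales the unit; a quick check with an illustrative probability:

```python
import math

p = 0.25
bits = -math.log2(p)  # base-2 logarithm -> bits
nats = -math.log(p)   # natural logarithm -> nats
# The two units differ only by the constant factor ln 2.
print(bits)                # 2.0
print(nats / math.log(2))  # the same value expressed back in bits
```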

8. When X and Y are statistically independent, I(x; y) is

a) 1

b) 0

c) Ln 2

d) Cannot be determined

Answer: b

Explanation: When X and Y are statistically independent, p(x, y) = p(x)p(y), so I(x; y) = log2[p(x, y)/(p(x)p(y))] = log2 1 = 0.
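
This follows directly from the definition I(x; y) = log2[p(x, y)/(p(x)p(y))]; a minimal sketch with illustrative probabilities:

```python
import math

def pointwise_mutual_information(p_xy, p_x, p_y):
    """I(x; y) = log2( p(x, y) / (p(x) p(y)) ), in bits."""
    return math.log2(p_xy / (p_x * p_y))

# Under independence p(x, y) = p(x) p(y), so the ratio is 1 and the log is 0.
p_x, p_y = 0.4, 0.3
print(pointwise_mutual_information(p_x * p_y, p_x, p_y))  # 0.0
```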

9. The self-information of a random variable is

a) 0

b) 1

c) Infinite

d) Cannot be determined

Answer: c

Explanation: The self-information of a continuous random variable is infinite, since it can take any of an uncountably infinite set of values.

10. Entropy of a random variable is

a) 0

b) 1

c) Infinite

d) Cannot be determined

Answer: c

Explanation: Likewise, the entropy of a continuous random variable is infinite.
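
For intuition from the discrete case: a uniform source over n outcomes has entropy log2 n bits, which grows without bound as n increases, hinting at why a continuous random variable carries infinite information. A sketch (values illustrative):

```python
import math

def entropy(probs):
    """Discrete entropy H = -sum p * log2 p, in bits (0 * log 0 taken as 0)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Uniform over n outcomes gives H = log2 n, unbounded as n grows.
for n in (2, 4, 1024):
    print(n, entropy([1 / n] * n))
```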

11. Which is more efficient method?

a) Encoding each symbol of a block

b) Encoding block of symbols

c) Both of the mentioned

d) None of the mentioned

Answer: b

Explanation: Encoding blocks of symbols is more efficient than encoding each symbol separately: as the block length grows, the average number of bits per symbol approaches the source entropy.
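
The gain can be seen on a toy two-symbol source; the pair code below is a valid prefix code chosen for illustration:

```python
# Source: P(A) = 0.9, P(B) = 0.1. Coding one symbol at a time cannot
# do better than 1 bit/symbol (A -> 0, B -> 1).
# Coding pairs with the prefix code AA->0, AB->10, BA->110, BB->111:
pairs = {"AA": 0.81, "AB": 0.09, "BA": 0.09, "BB": 0.01}
lengths = {"AA": 1, "AB": 2, "BA": 3, "BB": 3}
bits_per_pair = sum(pairs[s] * lengths[s] for s in pairs)
print(bits_per_pair / 2)  # about 0.645 bits/symbol, beating 1 bit/symbol
```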

12. Lempel-Ziv algorithm is

a) Variable to fixed length algorithm

b) Fixed to variable length algorithm

c) Fixed to fixed length algorithm

d) Variable to variable length algorithm

Answer: a

Explanation: The Lempel-Ziv algorithm is a variable to fixed length algorithm: variable-length source phrases are mapped to fixed-length codewords (dictionary indices).
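
A minimal LZ78-style parsing sketch (one common Lempel-Ziv variant; the handling of a trailing incomplete phrase differs between descriptions) shows variable-length phrases being emitted as fixed-format (index, character) pairs:

```python
def lz78_parse(s):
    """Parse s into (dictionary index, next character) pairs, LZ78 style."""
    dictionary = {"": 0}  # phrase -> index; index 0 is the empty phrase
    out = []
    phrase = ""
    for ch in s:
        if phrase + ch in dictionary:
            phrase += ch  # keep extending the current phrase
        else:
            out.append((dictionary[phrase], ch))
            dictionary[phrase + ch] = len(dictionary)
            phrase = ""
    if phrase:  # trailing phrase was already in the dictionary
        out.append((dictionary[phrase[:-1]], phrase[-1]))
    return out

print(lz78_parse("ababab"))  # [(0, 'a'), (0, 'b'), (1, 'b'), (1, 'b')]
```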

13. Coded systems are inherently capable of better transmission efficiency than uncoded systems.

a) True

b) False

Answer: a

Explanation: True. Coded systems are inherently capable of better transmission efficiency than uncoded systems.

**Sanfoundry Global Education & Learning Series – Digital Communications.**

To practice all areas of Digital Communications, __here is the complete set of 1000+ Multiple Choice Questions and Answers__.