**1. Explain quantization in detail.**

Quantization

Quantization error

Uniform Quantization

Non-uniform Quantization

Mid-rise quantization

Mid-tread quantization
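As an illustration, the mid-rise and mid-tread uniform quantizers (and the resulting quantization error) can be sketched in Python; the step size and input values here are arbitrary choices for demonstration:

```python
import math

def midrise_quantize(x, step):
    # Mid-rise: output levels at odd multiples of step/2 (zero is NOT a level)
    return step * (math.floor(x / step) + 0.5)

def midtread_quantize(x, step):
    # Mid-tread: output levels at integer multiples of step (zero IS a level)
    return step * round(x / step)

x, step = 0.23, 0.1
error = x - midtread_quantize(x, step)  # quantization error, bounded by step/2
```

For a uniform quantizer the error magnitude never exceeds half the step size, which is why a finer step (more bits per sample) reduces quantization noise.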

**2. Explain PCM and DPCM in detail.**

**PCM**

Block diagram for transmitter and receiver

• On-off signaling

• Return-to-zero signaling

• Non-return-to-zero signaling

Transmission path (regenerative repeaters)

• Equalization

• Timing circuit

• Decision making device

DPCM

Block diagram for transmitter and receiver

Working principle by prediction

3. Explain delta modulation and adaptive delta modulation in detail.

Block diagram for transmitter and receiver

Delta modulator response

Hunting

Slope overload

Block diagram for Adaptive delta modulation
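A minimal linear delta modulator can be sketched as follows; the step size and test signal are illustrative only. A constant input makes the staircase alternate around the signal (hunting, i.e. granular noise), while an input that rises faster than one step per sample causes slope overload:

```python
def delta_modulate(samples, delta):
    # One bit per sample: +delta if the input is above the staircase, else -delta
    approx, bits, staircase = 0.0, [], []
    for x in samples:
        bit = 1 if x >= approx else 0       # comparator decision
        approx += delta if bit else -delta  # accumulator (integrator)
        bits.append(bit)
        staircase.append(approx)
    return bits, staircase

# Constant input: after the staircase catches up, the bits alternate (hunting)
bits, stair = delta_modulate([0.625] * 8, delta=0.25)
```

Adaptive delta modulation addresses both effects by enlarging the step during long runs of identical bits (fighting slope overload) and shrinking it when the bits alternate (fighting hunting).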

4. Explain how 8 bits per sample can be reduced to 4 bits per sample.

Block diagram for Adaptive quantization with forward estimation

Block diagram for Adaptive quantization with backward estimation

Block diagram for Adaptive prediction with forward estimation

Block diagram for Adaptive prediction with backward estimation

5. Explain adaptive sub-band coding.

Block diagram of ASBC encoder

Block diagram of ASBC decoder

UNIT III

6. Explain linear block codes.

Derivation of linear block code

Generator Matrix

Parity check matrix

Syndrome decoding

• Properties of syndrome
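As a concrete sketch (assuming the standard (7,4) Hamming code as an example), systematic encoding with a generator matrix G = [I | P] and syndrome computation with the parity check matrix H = [P^T | I] look like this, with all arithmetic modulo 2:

```python
# Parity sub-matrix P for a (7,4) Hamming code (one common choice)
P = [[1, 1, 0],
     [0, 1, 1],
     [1, 1, 1],
     [1, 0, 1]]

def encode(m):
    # Systematic codeword c = mG: 4 message bits followed by 3 parity bits
    parity = [sum(m[i] * P[i][j] for i in range(4)) % 2 for j in range(3)]
    return m + parity

def syndrome(r):
    # s = r H^T; the all-zero syndrome means r is a valid codeword
    return [(sum(r[i] * P[i][j] for i in range(4)) + r[4 + j]) % 2
            for j in range(3)]
```

A single-bit error in position i produces a syndrome equal to the i-th row of H^T, which is how syndrome decoding locates the error.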

7. Explain cyclic codes.

Derivation of Cyclic codes

Generator polynomial

Parity check polynomial

Syndrome polynomial

• Properties of syndrome

8. Explain convolutional codes.

Design the convolutional encoder with the following concepts

M-stage shift register

n modulo-2 adders

Constraint length

Code rate

Generator polynomial
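The listed elements can be combined in a short sketch. As an assumed example, here is a rate-1/2 encoder with constraint length K = 3 (an M = 2 stage shift register and two modulo-2 adders with generator sequences 111 and 101):

```python
def conv_encode(bits, g1=(1, 1, 1), g2=(1, 0, 1)):
    # Rate 1/2: each input bit produces two output bits
    state = [0, 0]                 # M = 2 shift-register stages
    out = []
    for b in bits:
        window = [b] + state       # current input plus register contents
        out.append(sum(x * y for x, y in zip(window, g1)) % 2)  # adder 1
        out.append(sum(x * y for x, y in zip(window, g2)) % 2)  # adder 2
        state = [b, state[0]]      # shift the register
    return out
```

The constraint length K = M + 1 = 3 is the number of input bits that influence each output pair, and the code rate is k/n = 1/2.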

9. Write the procedure for designing an encoder circuit.

Multiplication of the message polynomial m(x) by x^(n-k)

Division of x^(n-k) m(x) by the generator polynomial g(x) to obtain the remainder b(x)

Addition of b(x) to x^(n-k) m(x) to form the desired code polynomial

To implement all such procedures we need the following requirements

Flip-flops

Modulo-2 adders

Gate

Switch

With the gate turned on and the switch in position 1, the information digits are shifted into the register and simultaneously into the communication channel. As soon as the k information digits have been shifted into the register, the register contains the parity-check bits.

With the gate turned off and the switch in position 2, the contents of the shift register are shifted into the channel.
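The three steps above amount to polynomial long division over GF(2). The (7,4) code with generator g(x) = 1 + x + x^3 is an assumed example here; bits are listed lowest degree first:

```python
def cyclic_encode(m, g):
    # Systematic encoding: c(x) = b(x) + x^(n-k) m(x),
    # where b(x) is the remainder of x^(n-k) m(x) divided by g(x)
    nk = len(g) - 1                          # degree of g(x) = n - k
    r = [0] * nk + m                         # step 1: multiply m(x) by x^(n-k)
    for i in range(len(r) - 1, nk - 1, -1):  # step 2: long division mod 2
        if r[i]:
            for j in range(len(g)):
                r[i - nk + j] ^= g[j]
    return r[:nk] + m                        # step 3: prepend b(x) to the message

g = [1, 1, 0, 1]                  # g(x) = 1 + x + x^3
codeword = cyclic_encode([1, 0, 1, 1], g)
```

The hardware version performs the same division incrementally with the (n-k) flip-flops of the feedback shift register as the message is shifted in.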

10. Write the procedure for designing a syndrome calculator circuit.

To implement all such procedures we need the following requirements

Flip-flops

Modulo-2 adders

Gate

Switch

This circuit is identical to the encoder except that the received bits are fed into the (n-k) stages of the feedback shift register from the left, with gate 2 open and gate 1 closed. As soon as all the received bits have been shifted into the shift register, the contents of the shift register define the syndrome s.
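In software the syndrome is just the division step of the encoder applied to the received word; a sketch (bits lowest degree first, with g(x) = 1 + x + x^3 again an assumed example):

```python
def cyclic_syndrome(received, g):
    # Syndrome s(x) = remainder of r(x) / g(x) over GF(2);
    # an all-zero syndrome means no detectable error
    nk = len(g) - 1
    r = list(received)                       # work on a copy
    for i in range(len(r) - 1, nk - 1, -1):  # same long division as the encoder
        if r[i]:
            for j in range(len(g)):
                r[i - nk + j] ^= g[j]
    return r[:nk]

g = [1, 1, 0, 1]                # g(x) = 1 + x + x^3
valid = [1, 0, 0, 1, 0, 1, 1]   # a codeword of the (7,4) cyclic code
```

Any valid codeword is a multiple of g(x), so its remainder is zero; flipping any single bit leaves a nonzero syndrome.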

UNIT IV

11. Explain various compression principles.

Source encoders and destination decoders

Lossless and lossy compression

Entropy encoding

• Run-length encoding

• Statistical encoding

Source encoding

Differential encoding

Transform encoding

12. Explain static and dynamic Huffman coding.

Static Huffman coding

- Root node, Branch node and Leaf node

- Figure for tree creation

Dynamic Huffman coding

- Both transmitter and receiver start with a single empty leaf node

- Read the first character

- Since the tree is initially empty, the ASCII representation of the first character is sent

- The character is then immediately assigned a place in the tree

- Check whether the tree is optimum or not

- If it is not optimum, the nodes are rearranged to satisfy the optimum condition

- For each subsequent character the encoder checks whether the character is already present in the tree

- If it is present, the corresponding code word is sent

- If it is not present, the encoder sends the current code word for the empty leaf

- The same procedure takes place on the decoder side

13. Explain digitized documents.

- Termination code table

- Make up code table

- Modified Huffman table

- Over scanning

- One-dimensional coding

- Two-dimensional coding

- Types of modes

Pass mode

Vertical mode

Horizontal mode

14. Explain the various stages of JPEG.

- Image / Block preparation

- Forward DCT

- Quantization

- Entropy Encoding

• Vectoring

• Differential encoding

• Run-length encoding

• Huffman encoding

- Frame building

- JPEG decoding

15. Write short notes on GIF and TIFF.

GIF

- Graphics interchange format

- Color images can be represented by 24-bit pixels

- Global color table

- Local color table

- Extending the table by using the Lempel-Ziv-Welsh (LZW) coding algorithm

- Interlaced mode

TIFF

- Tagged Image File Format

- Used in images and digitized documents

- Represented by 48-bit pixels

- Code numbers are used

UNIT V

16. Explain linear predictive coding and code-excited linear predictive coding.

LPC

- Perceptual features

Pitch

Period

Loudness

Origin

- Vocal tract excitation parameters

Voiced sounds

Unvoiced sounds

- Diagram for LPC encoder and decoder

CELP

- Enhanced excitation model

- Used in limited-bandwidth applications

- Waveform template

- Template codebook

- ITU-T recommendation standards

- Processing delay

- Algorithmic delay

- Look-ahead

17. Explain video compression principles.

- Frame types

o I-frames

o P-frames

o B-frames

o PB-frames

o D-frames

- Motion estimation

- Motion compensation

18. Explain MPEG audio coders and Dolby audio coders.

MPEG audio coders

- Diagram for encoding operation

- Diagram for decoding operation

Dolby audio coders

Forward adaptive bit allocation

Fixed bit allocation

Backward adaptive bit allocation

Hybrid backward/forward adaptive bit allocation

19. Write short notes on H.261.

Macro block format

Frame/picture format

GOB structure

20. Explain MPEG in detail.

MPEG-1

• MPEG-1 frame sequence

• MPEG-1 video bit-stream structure

MPEG-2

• HDTV

• MP@ML (Main Profile at Main Level)

MPEG-4

• Content-based functionalities

• AVOs (audio-visual objects)

• VOPs (video object planes)

UNIT I

21. Problems using Huffman coding.

There are three phases

• Generation of the Huffman code

i. Arrange the given source symbols in descending order of probability

ii. For binary Huffman coding, combine the two least probable source symbols into a single unit and place it in a new column with the other values

iii. Once again arrange the source values in decreasing order, as obtained in step ii

iv. Continue the process until only two source symbols are left

v. Assign codes (0, 1) working backward toward the initial stage

• Determination of the source entropy H(S) and the average code-word length L̄

• Check the validity condition using the source coding theorem: H(S) ≤ L̄. If the condition is satisfied, calculate the coding efficiency η = H(S)/L̄ and the code redundancy 1 − η.
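The procedure can be sketched with a min-heap, which performs the same repeated combine-and-reorder steps; the probabilities below are an arbitrary example:

```python
import heapq
import math

def huffman_code(probs):
    # Each heap entry: (probability, tie-breaker, {symbol: partial code word})
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)      # two least probable groups
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, count, merged))
        count += 1
    return heap[0][2]

probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
code = huffman_code(probs)
H = -sum(p * math.log2(p) for p in probs.values())   # source entropy H(S)
L = sum(p * len(code[s]) for s, p in probs.items())  # average length
efficiency = H / L
```

Because these probabilities are dyadic (negative powers of two), the code is exactly optimal and the efficiency is 1.0.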

22. Problems using Shannon-Fano coding.

There are three phases

• Generation of the Shannon-Fano code

i. List the source symbols in descending order of probability

ii. Partition the symbols into two almost equiprobable groups

iii. Assign '0' to one group and '1' to the other group

iv. Repeat steps (ii) and (iii) on each of the subgroups until each subgroup contains only one source symbol

• Determination of the source entropy H(S) and the average code-word length L̄

• Check the validity condition using the source coding theorem: H(S) ≤ L̄. If the condition is satisfied, calculate the coding efficiency and code redundancy.
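Steps (i)-(iv) translate into a short recursive sketch; the split point is chosen to make the two groups as nearly equiprobable as possible (the example probabilities are arbitrary):

```python
def shannon_fano(symbols):
    # symbols: list of (symbol, probability) pairs
    symbols = sorted(symbols, key=lambda sp: -sp[1])  # step i: descending order
    code = {}

    def split(group, prefix):
        if len(group) == 1:
            code[group[0][0]] = prefix or "0"
            return
        total = sum(p for _, p in group)
        # step ii: find the split minimising |P(left group) - P(right group)|
        best_i, best_diff, left = 1, None, 0.0
        for i in range(1, len(group)):
            left += group[i - 1][1]
            diff = abs(2 * left - total)
            if best_diff is None or diff < best_diff:
                best_i, best_diff = i, diff
        split(group[:best_i], prefix + "0")  # step iii: '0' to one group
        split(group[best_i:], prefix + "1")  # '1' to the other; step iv: recurse

    split(symbols, "")
    return code

code = shannon_fano([("a", 0.5), ("b", 0.25), ("c", 0.125), ("d", 0.125)])
```

For dyadic probabilities like these, Shannon-Fano and Huffman coding produce code words of identical lengths.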

23. Problems using the extension property.

• Calculate the entropy of the source.

If it is a second-order extension, then H(S²) = 2·H(S)

If it is a third-order extension, then H(S³) = 3·H(S)
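The property H(Sⁿ) = n·H(S) for a memoryless source can be checked numerically; the probabilities below are an arbitrary example:

```python
import math
from itertools import product

def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

def extension_entropy(probs, n):
    # n-th order extension: symbols are n-tuples, and their probabilities
    # multiply because the source is memoryless
    return entropy([math.prod(t) for t in product(probs, repeat=n)])

p = [0.5, 0.25, 0.25]
H1 = entropy(p)                  # H(S)   = 1.5 bits
H2 = extension_entropy(p, 2)     # H(S^2) = 3.0 bits
H3 = extension_entropy(p, 3)     # H(S^3) = 4.5 bits
```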

24. Problems for calculating all entropies.

• Calculate the source entropy H(X)

• Calculate the destination entropy H(Y)

• Calculate the joint entropy H(X, Y)

• Calculate the conditional entropy H(X/Y)

• Calculate the conditional entropy H(Y/X)

Check using the entropy inequalities

• 0 ≤ H(X/Y) ≤ H(X)

• 0 ≤ H(Y/X) ≤ H(Y)

• H(X, Y) ≤ H(X) + H(Y)
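All five entropies follow from the joint probability matrix P(X, Y); the chain rule gives the conditional entropies. The joint matrix below is an arbitrary example:

```python
import math

def H(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

def all_entropies(joint):
    # joint[i][j] = P(X = x_i, Y = y_j)
    px = [sum(row) for row in joint]            # marginal of X
    py = [sum(col) for col in zip(*joint)]      # marginal of Y
    Hx, Hy = H(px), H(py)
    Hxy = H([p for row in joint for p in row])  # joint entropy H(X, Y)
    # chain rule: H(X/Y) = H(X,Y) - H(Y) and H(Y/X) = H(X,Y) - H(X)
    return Hx, Hy, Hxy, Hxy - Hy, Hxy - Hx

Hx, Hy, Hxy, Hx_y, Hy_x = all_entropies([[0.5, 0.0], [0.25, 0.25]])
```

The three inequalities above can then be checked directly on the returned values.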

25. Write the properties of mutual information.

• Mutual information of a channel is symmetric

I(X; Y) = I(Y; X)

• Mutual information is always non-negative

I(X; Y) ≥ 0

• Mutual information is related to the joint entropy

I(X; Y) = H(X) + H(Y) − H(X, Y)
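The three properties can be verified numerically from a joint probability matrix (the matrices below are arbitrary examples):

```python
import math

def H(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

def mutual_information(joint):
    # I(X;Y) = H(X) + H(Y) - H(X,Y)
    px = [sum(row) for row in joint]
    py = [sum(col) for col in zip(*joint)]
    Hxy = H([p for row in joint for p in row])
    return H(px) + H(py) - Hxy

dependent = [[0.3, 0.2], [0.1, 0.4]]
independent = [[0.25, 0.25], [0.25, 0.25]]
transposed = [list(col) for col in zip(*dependent)]
I = mutual_information(dependent)
```

Transposing the joint matrix swaps the roles of X and Y, so comparing I for `dependent` and `transposed` demonstrates the symmetry property; an independent pair gives I = 0.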

26. Explain various coding theorems.

• Shannon first theorem (or) Source coding theorem (or) Shannon noiseless coding theorem.

• Shannon second theorem (or) Channel coding theorem.

• Shannon third theorem (or) Information capacity theorem (or) Channel capacity theorem.
