## Covering properties of convolutional codes and associated lattices

This talk describes methods for analyzing the expected and worst-case performance of sequence-based methods of quantization. We suppose that the quantization algorithm is dynamic programming, where the current step depends on a vector of path metrics, which we call a metric function. Our principal objective is a concise representation of these metric functions and of the possible trajectories of the dynamic programming algorithm. We consider quantization of equiprobable binary data using a convolutional code. Here the additive group of the code splits the set of metric functions into a finite collection of subsets. The subsets form the vertices of a directed graph, where edges are labelled by aggregate incremental increases in mean squared error (mse). Paths in this graph correspond both to trajectories of the Viterbi algorithm and to cosets of the code. For the rate-1/2 convolutional code [1 + D², 1 + D + D²], this graph has only 9 vertices. In this case it is particularly simple to calculate per-dimension expected and worst-case mse, and performance is similar to that of the binary [24, 12] Golay code. Our methods also apply to quantization of arbitrary symmetric probability distributions on [0, 1] using convolutional codes. For the uniform distribution on [0, 1], the expected mse is the second moment of the 'Voronoi region' of an infinite-dimensional lattice determined by the convolutional code. It may also be interpreted as an increase in the reliability of a transmission scheme obtained by nonequiprobable signalling. For certain convolutional codes we obtain a formula for expected mse that depends only on the distribution of differences for a single pair of path metrics.
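As a minimal sketch of the setup described above (not the authors' implementation), the following Python code quantizes a block of equiprobable binary data with the rate-1/2 code [1 + D², 1 + D + D²] by dynamic programming on a vector of path metrics, then reports the per-dimension squared error. The function names and the random test data are illustrative assumptions.

```python
import random

# Rate-1/2 feedforward convolutional code with generator polynomials
# g1 = 1 + D^2 and g2 = 1 + D + D^2 (4 states). State = (u[t-1], u[t-2]).
def outputs(state, u):
    s1, s2 = state
    return (u ^ s2, u ^ s1 ^ s2)          # (g1 output, g2 output)

def next_state(state, u):
    return (u, state[0])

def viterbi_quantize(target):
    """Quantize a binary sequence (length 2n) to the nearest codeword in
    squared error, via dynamic programming on the vector of path metrics."""
    n = len(target) // 2
    states = [(a, b) for a in (0, 1) for b in (0, 1)]
    # Metric function: one accumulated squared error per trellis state.
    metric = {s: (0.0 if s == (0, 0) else float('inf')) for s in states}
    paths = {s: [] for s in states}
    for t in range(n):
        tgt = target[2 * t: 2 * t + 2]
        new_metric, new_paths = {}, {}
        for s in states:
            best = None
            for prev in states:
                for u in (0, 1):
                    if next_state(prev, u) != s:
                        continue
                    c = outputs(prev, u)
                    m = metric[prev] + sum((a - b) ** 2 for a, b in zip(c, tgt))
                    if best is None or m < best[0]:
                        best = (m, paths[prev] + list(c))
            new_metric[s], new_paths[s] = best
        metric, paths = new_metric, new_paths
    s_best = min(states, key=lambda s: metric[s])
    return paths[s_best], metric[s_best]

random.seed(1)
data = [random.randint(0, 1) for _ in range(200)]   # equiprobable binary data
codeword, sq_err = viterbi_quantize(data)
print("per-dimension mse:", sq_err / len(data))
```

Each pass of the loop updates the whole metric function at once, so the trajectory of the algorithm is exactly a walk on the vectors of path metrics, which is what the 9-vertex quotient graph in the abstract summarizes.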

### Citation

*Proceedings of the 1993 IEEE International Symposium on Information Theory*, January 1993, p. 141.