In distributed classification applications, computational constraints require that data acquired by low-complexity clients be compressed and transmitted to a remote server for classification. This paper considers the design of optimal quantizers for such applications and evaluates them in the context of a speech recognition task. The proposed encoder minimizes the detrimental effect of compression on classification performance. Specifically, the proposed methods concentrate on designing low-dimensional encoders, in which individual encoders independently quantize sub-dimensions of the high-dimensional vector used for classification. The main novelty of this work is the introduction of mutual information as a metric for designing compression algorithms in classification applications: given a rate constraint, the proposed algorithm minimizes the mutual-information loss due to compression, or equivalently, ensures that the compressed data used for classification retains maximal information about the class labels. An iterative empirical algorithm, similar to the Lloyd algorithm, is provided to design quantizers for this new distortion measure. Mutual information is also used to derive a rate-allocation scheme in which rates are assigned to the independently encoded sub-dimensions of a vector so as to satisfy an overall rate constraint. The results indicate that mutual information is a better metric than mean squared error for optimizing encoders used in distributed classification applications. In a distributed spoken-names recognition task, the proposed mutual-information-based rate allocation reduces the compression-induced increase in word error rate (WER) by a factor of six compared to a heuristic rate allocation.
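The Lloyd-style iterative design described above can be illustrated with a minimal sketch. This is not the paper's implementation; it is a hedged reconstruction assuming the standard Lloyd structure with the squared-error distortion replaced by a mutual-information-loss surrogate, namely the KL divergence between a sample's class posterior p(c | x) and the averaged posterior of its quantization cell. The function name `mi_lloyd_quantizer` and its interface are hypothetical.

```python
import numpy as np

def mi_lloyd_quantizer(posteriors, weights, K, iters=50, seed=0):
    """Lloyd-style quantizer design under a mutual-information-loss distortion.

    posteriors : (N, C) array, empirical class posteriors p(c | x_i)
    weights    : (N,) empirical probabilities of the training samples
    K          : number of quantization cells (codebook size)

    Alternates an assignment step (each sample joins the cell whose
    centroid posterior minimizes KL(p(c|x) || centroid)) with a centroid
    step (each centroid becomes the weighted average posterior of its
    cell), mirroring the nearest-neighbor / centroid steps of Lloyd.
    """
    rng = np.random.default_rng(seed)
    N, C = posteriors.shape
    eps = 1e-12  # numerical floor to keep logs finite
    # Initialize centroids from K distinct training samples.
    centroids = posteriors[rng.choice(N, size=K, replace=False)].copy()
    assign = np.zeros(N, dtype=int)
    for _ in range(iters):
        # Assignment step: KL divergence from each sample to each centroid.
        log_ratio = np.log(posteriors[:, None, :] + eps) - np.log(centroids[None, :, :] + eps)
        kl = np.sum(posteriors[:, None, :] * log_ratio, axis=2)  # shape (N, K)
        assign = np.argmin(kl, axis=1)
        # Centroid step: weighted average posterior within each cell.
        for k in range(K):
            mask = assign == k
            if mask.any():
                w = weights[mask]
                centroids[k] = (w[:, None] * posteriors[mask]).sum(axis=0) / w.sum()
    return assign, centroids
```

Because KL divergence is a Bregman divergence, both steps are non-increasing in the average distortion, so the iteration converges to a local optimum, just as squared-error Lloyd does.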