The memory required to store the context model of a PPM-style compressor grows exponentially with the order of the model (i.e., the length of the context). Reducing the memory requirement of a large context model without sacrificing its coding efficiency is a challenging research problem. In this paper, we focus on bi-level image coding and investigate context reduction by clustering: contexts that predict similar probability distributions are grouped together to share a common entropy coder. We give an O(kn) algorithm for optimally grouping n contexts into k clusters so that the total loss in coding efficiency is minimized; previously, no algorithm was known for this problem. We demonstrate the effectiveness of clustering by implementing a two-level compression scheme. Experimental results on the CCITT test images show that, using the same amount of memory, our scheme achieves better compression than the two-level PPM method of A. Moffat (1991).
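The paper's O(kn) bound relies on matrix-searching techniques for cost functions with the concave Monge property (see references [1], [2], [4], [5]); as a rough illustration of the underlying optimization problem only, here is the standard O(kn²) dynamic program for partitioning n ordered items into k contiguous clusters with minimum total loss. The function names and the squared-error cost are illustrative assumptions, not the paper's actual coding-loss measure.

```python
def optimal_k_clustering(cost, n, k):
    """Minimum total loss of splitting items 0..n-1 into k contiguous clusters.

    cost(i, j) is the loss of merging items i..j (inclusive) into one cluster.
    This is the plain O(k n^2) dynamic program; the paper's O(kn) algorithm
    exploits the concave Monge property of the cost to speed up this recurrence.
    """
    INF = float("inf")
    # dp[m][j] = min loss of partitioning the first j items into m clusters
    dp = [[INF] * (n + 1) for _ in range(k + 1)]
    dp[0][0] = 0.0
    for m in range(1, k + 1):
        for j in range(m, n + 1):
            # last cluster covers items i..j-1; try every valid split point i
            dp[m][j] = min(dp[m - 1][i] + cost(i, j - 1)
                           for i in range(m - 1, j))
    return dp[k][n]


def sse_cost(items):
    """Illustrative cluster loss: sum of squared deviations from the mean."""
    def cost(i, j):
        seg = items[i:j + 1]
        mu = sum(seg) / len(seg)
        return sum((x - mu) ** 2 for x in seg)
    return cost
```

For example, partitioning [1, 2, 10, 11] into k = 2 clusters yields clusters {1, 2} and {10, 11} with total squared-error loss 1.0. In the paper's setting the items would be contexts ordered by their predicted probabilities and the cost would measure the increase in code length from sharing one entropy coder.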
[1] F. Frances Yao et al. Efficient dynamic programming using quadrangle inequalities, 1980, STOC '80.
[2] Alok Aggarwal et al. Geometric applications of a matrix searching algorithm, 1986, Symposium on Computational Geometry.
[3] A. Moffat. Two-level context based compression of binary images, 1991, Proceedings of the Data Compression Conference.
[4] Xiaolin Wu et al. Optimal quantization by matrix searching, 1991, J. Algorithms.
[5] Alok Aggarwal et al. Finding a minimum-weight k-link path in graphs with the concave Monge property and applications, 1994, Discrete Comput. Geom.
[6] Shmuel Tomi Klein et al. An overhead reduction technique for mega-state compression schemes, 1997, Proceedings DCC '97, Data Compression Conference.