LEARNING-THEORETIC METHODS IN VECTOR QUANTIZATION
[1] J. MacQueen. Some methods for classification and analysis of multivariate observations, 1967.
[2] Herbert Gish, et al. Asymptotically efficient quantizing, 1968, IEEE Trans. Inf. Theory.
[3] R. Tyrrell Rockafellar. Convex Analysis, 1970, Princeton Landmarks in Mathematics and Physics.
[4] Toby Berger. Optimum quantizers and permutation codes, 1972, IEEE Trans. Inf. Theory.
[5] Robert M. Gray, et al. Quantizer Mismatch, 1975, IEEE Trans. Commun.
[6] R. Gray, et al. A Generalization of Ornstein's $\bar d$ Distance with Applications to Information Theory, 1975.
[7] S. Szarek. On the best constants in the Khinchin inequality, 1976.
[8] E. Slud. Distribution Inequalities for the Binomial Law, 1977.
[9] R. Dudley. Central Limit Theorems for Empirical Measures, 1978.
[10] Robert M. Gray, et al. An Algorithm for Vector Quantizer Design, 1980, IEEE Trans. Commun.
[11] D. Pollard. Strong Consistency of $K$-Means Clustering, 1981.
[12] Gary L. Wise, et al. On the existence of optimal quantizers, 1982, IEEE Trans. Inf. Theory.
[13] S. P. Lloyd. Least squares quantization in PCM, 1982, IEEE Trans. Inf. Theory.
[14] D. Pollard. A Central Limit Theorem for $k$-Means Clustering, 1982.
[15] Paul L. Zador. Asymptotic quantization error of continuous signals and the quantization dimension, 1982, IEEE Trans. Inf. Theory.
[16] David Pollard. Quantization and the method of $k$-means, 1982, IEEE Trans. Inf. Theory.
[17] G. Wise, et al. Convergence of Vector Quantizers with Applications to Optimal Quantization, 1984.
[18] Robert M. Gray, et al. Global convergence and empirical consistency of the generalized Lloyd algorithm, 1986, IEEE Trans. Inf. Theory.
[19] R. Gray. Source Coding Theory, 1989.
[20] Philip A. Chou, et al. Entropy-constrained vector quantization, 1989, IEEE Trans. Acoust. Speech Signal Process.
[21] D. Pollard. Empirical Processes: Theory and Applications, 1990.
[22] S. M. Perlmutter, et al. Training sequence size and vector quantizer performance, 1991, Conference Record of the Twenty-Fifth Asilomar Conference on Signals, Systems & Computers.
[23] Thomas M. Cover, Joy A. Thomas. Elements of Information Theory, 1991.
[24] Allen Gersho, et al. Vector Quantization and Signal Compression, 1991, The Kluwer International Series in Engineering and Computer Science.
[25] P. Chou. The distortion of vector quantizers trained on $n$ vectors decreases to the optimum as $O_p(1/n)$, 1994, Proceedings of 1994 IEEE International Symposium on Information Theory.
[26] G. Lugosi, et al. Rates of convergence in the source coding theorem, in empirical quantizer design, and in universal lossy source coding, 1994, Proceedings of 1994 IEEE International Symposium on Information Theory.
[27] David A. Cohn, et al. Theory and Practice of Vector Quantizers Trained on Small Training Sets, 1994, IEEE Trans. Pattern Anal. Mach. Intell.
[28] László Györfi, et al. A Probabilistic Theory of Pattern Recognition, 1996, Stochastic Modelling and Applied Probability.
[29] Khalid Sayood. Introduction to Data Compression, 1996.
[30] Neri Merhav, et al. On the amount of statistical side information required for lossy data compression, 1997, IEEE Trans. Inf. Theory.
[31] Vahid Tarokh, et al. Existence of optimal prefix codes for infinite source alphabets, 1997, IEEE Trans. Inf. Theory.
[32] Tamás Linder, et al. Empirical quantizer design in the presence of source noise or channel noise, 1997, IEEE Trans. Inf. Theory.
[33] Vladimir Vapnik. Statistical Learning Theory, 1998.
[34] Assaf J. Zeevi. On the performance of vector quantizers empirically designed from dependent sources, 1998, Proceedings DCC '98 Data Compression Conference.
[35] Robert M. Gray, David L. Neuhoff. Quantization, 1998, IEEE Trans. Inf. Theory.
[36] Tamás Linder, et al. The minimax distortion redundancy in empirical quantizer design, 1997, Proceedings of IEEE International Symposium on Information Theory.
[37] P. A. Chou, et al. When optimal entropy-constrained quantizers have only a finite number of codewords, 1998, Proceedings of 1998 IEEE International Symposium on Information Theory.
[38] Peter L. Bartlett, et al. Neural Network Learning: Theoretical Foundations, 1999.
[39] R. Ash, et al. Probability and Measure Theory, 1999.
[40] S. Graf, et al. Foundations of Quantization for Probability Distributions, 2000.
[41] Tamás Linder, et al. Optimal entropy-constrained scalar quantization of a uniform source, 2000, IEEE Trans. Inf. Theory.
[42] Tamás Linder. On the training distortion of vector quantizers, 2000, IEEE Trans. Inf. Theory.
[43] T. Linder, et al. On the structure of entropy-constrained scalar quantizers, 2001, Proceedings of 2001 IEEE International Symposium on Information Theory.
[44] Gábor Lugosi. Pattern Classification and Learning Theory, 2002.
[45] R. M. Dudley. Real Analysis and Probability, 2002.
[46] Tamás Linder, et al. A Lagrangian formulation of Zador's entropy-constrained quantization theorem, 2002, IEEE Trans. Inf. Theory.
[47] Harald Luschgy, et al. Rates of convergence for the empirical quantization error, 2002.
[48] Tamás Linder, et al. On the structure of optimal entropy-constrained scalar quantizers, 2002, IEEE Trans. Inf. Theory.