Source Coding Theory
1 Information Sources
  1.1 Probability Spaces
  1.2 Random Variables and Vectors
  1.3 Random Processes
  1.4 Expectation
  1.5 Ergodic Properties
  Exercises
2 Codes, Distortion, and Information
  2.1 Basic Models of Communication Systems
  2.2 Code Structures
  2.3 Code Rate
  2.4 Code Performance
  2.5 Optimal Performance
  2.6 Information
    2.6.1 Information and Entropy Rates
  2.7 Limiting Properties
  2.8 Related Reading
  Exercises
3 Distortion-Rate Theory
  3.1 Introduction
  3.2 Distortion-Rate Functions
  3.3 Almost Noiseless Codes
  3.4 The Source Coding Theorem for Block Codes
    3.4.1 Block Codes
    3.4.2 A Coding Theorem
  3.5 Synchronizing Block Codes
  3.6 Sliding-Block Codes
  3.7 Trellis Encoding
  Exercises
4 Rate-Distortion Functions
  4.1 Basic Properties
  4.2 The Variational Equations
  4.3 The Discrete Shannon Lower Bound
  4.4 The Blahut Algorithm
  4.5 Continuous Alphabets
  4.6 The Continuous Shannon Lower Bound
  4.7 Vectors and Processes
    4.7.1 The Wyner-Ziv Lower Bound
    4.7.2 The Autoregressive Lower Bound
    4.7.3 The Vector Shannon Lower Bound
  4.8 Norm Distortion
  Exercises
5 High Rate Quantization
  5.1 Introduction
  5.2 Asymptotic Distortion
  5.3 The High Rate Lower Bound
  5.4 High Rate Entropy
  5.5 Lattice Vector Quantizers
  5.6 Optimal Performance
  5.7 Comparison of the Bounds
  5.8 Optimized VQ vs. Uniform Quantization
  5.9 Quantization Noise
  Exercises
6 Uniform Quantization Noise
  6.1 Introduction
  6.2 Uniform Quantization
  6.3 PCM Quantization Noise: Deterministic Inputs
  6.4 Random Inputs and Dithering
  6.5 Sigma-Delta Modulation
  6.6 Two-Stage Sigma-Delta Modulation
  6.7 Delta Modulation
  Exercises
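
The contents list the Blahut algorithm (Section 4.4) for computing points on the rate-distortion function of a discrete source. The following is a minimal NumPy sketch of that alternating-minimization iteration, not code from the book; the function name, the binary Hamming-distortion example, and the Lagrange parameter `beta` that traces out the curve are illustrative assumptions.

```python
import numpy as np

def blahut_rate_distortion(p_x, d, beta, n_iter=500, tol=1e-10):
    """One (D, R) point on the rate-distortion curve via the Blahut
    alternating-minimization iteration, for a discrete source.

    p_x  : source probabilities, shape (n,)
    d    : distortion matrix d[x, xhat], shape (n, m)
    beta : positive Lagrange parameter; larger beta gives smaller distortion
    """
    n, m = d.shape
    q = np.full(m, 1.0 / m)          # reproduction marginal q(xhat), start uniform
    A = np.exp(-beta * d)            # kernel exp(-beta * d(x, xhat))
    for _ in range(n_iter):
        Q = A * q                    # unnormalized conditional Q(xhat | x)
        Q /= Q.sum(axis=1, keepdims=True)
        q_new = p_x @ Q              # re-estimate the reproduction marginal
        if np.max(np.abs(q_new - q)) < tol:
            q = q_new
            break
        q = q_new
    Q = A * q
    Q /= Q.sum(axis=1, keepdims=True)
    joint = p_x[:, None] * Q         # joint distribution p(x) Q(xhat | x)
    D = np.sum(joint * d)            # average distortion at this beta
    safe_q = np.where(q > 0, q, 1.0)
    ratio = np.where(joint > 0, Q / safe_q, 1.0)
    R = np.sum(joint * np.log2(ratio))   # mutual information in bits
    return D, R

if __name__ == "__main__":
    # Fair binary source with Hamming distortion: R(D) = 1 - h_b(D)
    p = np.array([0.5, 0.5])
    d = 1.0 - np.eye(2)
    for beta in (0.5, 1.0, 2.0, 5.0):
        D, R = blahut_rate_distortion(p, d, beta)
        print(f"beta={beta:4.1f}  D={D:.4f}  R={R:.4f} bits")
```

For the binary example, the printed points should fall on R(D) = 1 - h_b(D), the known rate-distortion function of a fair binary source under Hamming distortion; sweeping `beta` traces out the curve.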