BLIND PERFORMANCE ESTIMATION AND QUANTIZER DESIGN WITH APPLICATIONS TO RELAY NETWORKS

In this thesis, we introduce blind estimators for several performance metrics of Bayesian detectors, study rate-information-optimal quantization and introduce algorithms for quantizer design in the communications context, and apply our results to a relay-based cooperative communication scheme.

After a discussion of the background material on which this thesis builds, we study blind performance estimation for Bayesian detectors. We consider simple binary and M-ary hypothesis tests and introduce blind estimators for the conditional and unconditional error probabilities, the minimum mean-square error (MSE), and the mutual information. The proposed blind estimators are shown to be unbiased and consistent. Furthermore, we compare the blind estimators for the error probabilities to the corresponding nonblind estimators and give conditions under which the blind estimators dominate their nonblind counterparts for arbitrary distributions of the data. In particular, we show that the blind estimator for the unconditional error probability always dominates the corresponding nonblind estimator in terms of MSE. Subsequently, we derive the Cramér-Rao lower bound for bit error probability estimation under maximum a posteriori detection and show that no efficient estimator exists for this problem. Application examples conclude the discussion of blind performance estimators.

We then introduce an approach to quantization that we call rate-information quantization. The main idea of rate-information-optimal quantization is to compress data such that its quantized representation is as informative as possible about another random variable, termed the relevance variable, which is correlated with the data. In contrast to rate-distortion (RD) quantization, the rate-information approach is well suited for communication problems.
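To give a flavor of the blind estimation idea, the following sketch illustrates one instance in a setting not taken from the abstract itself: uncoded BPSK over an AWGN channel, where the posterior probability that the MAP bit decision is wrong can be read off the log-likelihood ratio as 1/(1 + e^{|L|}). Averaging these posteriors yields an error-probability estimate that never looks at the true bits, in contrast to the nonblind error count.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setting (an assumption, not the thesis' general model):
# BPSK over AWGN, x in {+1, -1}, y = x + n with n ~ N(0, sigma^2).
n_bits, sigma = 200_000, 0.8
bits = rng.integers(0, 2, n_bits)
x = 1.0 - 2.0 * bits                       # bit 0 -> +1, bit 1 -> -1
y = x + sigma * rng.normal(size=n_bits)

# Exact LLRs L = log P(b=0|y) / P(b=1|y) for this channel.
llr = 2.0 * y / sigma**2

# Nonblind estimate: count actual MAP decision errors (needs the true bits).
decisions = (llr < 0).astype(int)
ber_nonblind = np.mean(decisions != bits)

# Blind estimate: the posterior probability that the MAP decision on bit i
# is wrong equals 1 / (1 + exp(|L_i|)); average it -- no true bits needed.
ber_blind = np.mean(1.0 / (1.0 + np.exp(np.abs(llr))))

print(f"blind: {ber_blind:.4f}  nonblind: {ber_nonblind:.4f}")
```

With exact LLRs, both estimates converge to the same error probability; the blind estimate is available at the receiver without any reference data, which is the practical appeal of the estimators studied in this thesis.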
We focus on the case where the data and the relevance variable are jointly Gaussian and we derive closed-form expressions for the optimal trade-off between the compression rate and the preserved information about the relevance variable. It is then shown that the optimal rate-information trade-off is achieved by suitable linear preprocessing of the data with subsequent MSE-optimal source coding. This result connects RD theory, the Gaussian information bottleneck, and minimum MSE estimation. Furthermore, we show
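The rate-information trade-off above can be illustrated numerically in the simplest special case. The closed form below is an assumption for the scalar jointly Gaussian pair with correlation coefficient rho (the thesis treats the general vector setting): the preserved information at compression rate R bits is taken to be I(R) = -1/2 log2(1 - rho^2 (1 - 2^(-2R))), which vanishes at R = 0 and saturates at the total mutual information I(X;Y) = -1/2 log2(1 - rho^2).

```python
import numpy as np

def rate_information(R, rho):
    """Assumed scalar-Gaussian rate-information function (illustrative):
    information about the relevance variable preserved when data with
    correlation rho is compressed at rate R bits."""
    return -0.5 * np.log2(1.0 - rho**2 * (1.0 - 2.0 ** (-2.0 * np.asarray(R))))

rho = 0.9
rates = np.linspace(0.0, 10.0, 101)
info = rate_information(rates, rho)
i_xy = -0.5 * np.log2(1.0 - rho**2)        # total mutual information I(X;Y)

print(f"I(R=1) = {rate_information(1.0, rho):.3f} bits, "
      f"saturating at I(X;Y) = {i_xy:.3f} bits")
```

The curve is zero at zero rate, monotonically increasing, and bounded by I(X;Y), reflecting the data-processing inequality: no quantizer can preserve more information about the relevance variable than the data itself carries.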
