An approximation of the Gilbert-Elliott channel via a queue-based channel model

We investigate the modeling of the well-known burst-noise Gilbert-Elliott channel (GEC) using a recently introduced queue-based channel (QBC) model. The QBC parameters are estimated by minimizing the Kullback-Leibler divergence rate between the probability of error sequences generated by the QBC and the GEC, while maintaining identical bit error rates and correlation coefficients. The accuracy of fitting the GEC via the QBC is evaluated in terms of channel capacity and autocorrelation function. Numerical results show that the QBC provides a very good approximation of the GEC for various channel conditions. It thus offers an interesting alternative to the GEC while remaining mathematically tractable.
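To make the quantities being matched concrete, here is a minimal, illustrative sketch (not the paper's fitting procedure) that simulates a Gilbert-Elliott error process and estimates the two statistics the QBC fit preserves: the bit error rate and the (lag-one) correlation coefficient of the error sequence. The parameter values below are hypothetical examples chosen to produce bursty errors, not values from the paper.

```python
import random

def simulate_gec(n, b, g, p_g, p_b, seed=0):
    """Simulate a Gilbert-Elliott channel error sequence.

    b: P(good -> bad), g: P(bad -> good),
    p_g / p_b: bit error probability in the good / bad state.
    Returns a list of n binary error indicators.
    """
    rng = random.Random(seed)
    # Start from the stationary distribution: P(bad) = b / (b + g).
    state_bad = rng.random() < b / (b + g)
    errors = []
    for _ in range(n):
        p_err = p_b if state_bad else p_g
        errors.append(1 if rng.random() < p_err else 0)
        # Two-state Markov transition.
        if state_bad:
            state_bad = rng.random() >= g  # stay bad with prob 1 - g
        else:
            state_bad = rng.random() < b   # go bad with prob b
    return errors

def ber_and_corr(errors):
    """Empirical bit error rate and lag-one correlation coefficient."""
    n = len(errors)
    mean = sum(errors) / n
    var = mean * (1 - mean)
    cov = sum((errors[i] - mean) * (errors[i + 1] - mean)
              for i in range(n - 1)) / (n - 1)
    return mean, (cov / var if var > 0 else 0.0)

# Hypothetical bursty channel: rare, sticky bad state with high error rate.
# Stationary P(bad) = 0.01 / 0.11, so theoretical BER
# = (10 * 0.001 + 0.3) / 11, roughly 0.028.
e = simulate_gec(200_000, b=0.01, g=0.1, p_g=0.001, p_b=0.3, seed=42)
ber, rho = ber_and_corr(e)
```

A QBC fit would then be constrained to reproduce `ber` and `rho` exactly, with the remaining QBC parameters chosen to minimize the Kullback-Leibler divergence rate between the two error processes.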
