Theoretical performance analysis of maximum likelihood sequence estimation in intensity modulated short-haul fiber optic links

It is well known that the optimal receiver for channels with intersymbol interference (ISI) is the maximum likelihood sequence estimation (MLSE) receiver. MLSE achieves optimal performance by exploiting the deterministic correlations introduced by chromatic-dispersion-induced ISI together with the known channel noise statistics. This paper provides a theoretical examination of MLSE under the assumption that the performance of the optical links is dominated by thermal noise in the detection and/or post-detection circuitry. A Gaussian noise model is adopted because thermal-noise-limited operation tends to dominate in short-haul and metropolitan networks, which contain no optical amplifiers.
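As an illustrative sketch (not taken from the paper), MLSE for an ISI channel with additive Gaussian noise is typically realized with the Viterbi algorithm: under the Gaussian assumption, maximizing the sequence likelihood is equivalent to minimizing the squared Euclidean distance between the received samples and the noiseless channel outputs of each candidate symbol sequence. The channel taps `h`, the on-off intensity alphabet, and the zero-filled initial channel memory below are hypothetical example values.

```python
import numpy as np

def mlse_viterbi(y, h, alphabet=(0.0, 1.0)):
    # Viterbi MLSE for y[k] = sum_m h[m] * x[k-m] + n[k] with additive white
    # Gaussian noise: minimizing the accumulated squared Euclidean distance
    # is equivalent to maximizing the sequence likelihood.
    # Assumes channel memory L = len(h) - 1 >= 1 and a zero-filled start.
    L = len(h) - 1
    M = len(alphabet)
    n_states = M ** L
    cost = np.full(n_states, np.inf)
    cost[0] = 0.0                       # assumed all-zeros initial state
    paths = {0: []}                     # survivor path per reachable state
    for yk in y:
        new_cost = np.full(n_states, np.inf)
        new_paths = {}
        for s, path in paths.items():
            # state s encodes the L previous symbols, most recent in the
            # lowest base-M digit
            past = [alphabet[(s // M**i) % M] for i in range(L)]
            for a_idx, a in enumerate(alphabet):
                window = [a] + past     # x[k], x[k-1], ..., x[k-L]
                pred = sum(h[m] * window[m] for m in range(L + 1))
                metric = cost[s] + (yk - pred) ** 2
                ns = a_idx + (s % M ** (L - 1)) * M  # shift in new symbol
                if metric < new_cost[ns]:
                    new_cost[ns] = metric
                    new_paths[ns] = path + [a]
        cost, paths = new_cost, new_paths
    return paths[int(np.argmin(cost))]
```

For example, with a two-tap channel `h = [1.0, 0.5]` the noiseless output of the sequence `1, 0, 1, 1, 0` is `1.0, 0.5, 1.0, 1.5, 0.5`, and the sketch recovers the transmitted sequence exactly from those samples.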