Performance analysis of the modified maximum-likelihood sequence detector in the presence of data-dependent noise

The use of a modified maximum-likelihood sequence detector (MLSD) for channels with data-dependent noise is proposed. A new error metric is derived from the nonstationary, transition-dependent characteristics of the noise, and simplified error metrics are also suggested to reduce implementation complexity. The detector's performance is analyzed, and the error rate is shown to follow from a chi-square distribution. Simulation results show that the modified MLSD outperforms the conventional MLSD in a jitter-dominated channel. It is further shown that the logarithm term in the new error metric can be neglected without loss of performance, reducing complexity.
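The abstract does not reproduce the metric itself, but for Gaussian noise whose variance depends on the written transition, a per-branch maximum-likelihood metric of the form ln(sigma_b) + (r_k - m_b)^2 / (2 sigma_b^2) is the standard construction; the ln(sigma_b) term is the logarithm term the authors report can be dropped. The sketch below illustrates this kind of metric inside a toy two-state Viterbi detector. The trellis, channel means, and noise levels are illustrative assumptions for the sketch, not values from the paper.

```python
import numpy as np

def branch_metric(r, mean, sigma, use_log_term=True):
    """Data-dependent ML branch metric: ln(sigma) + (r - mean)^2 / (2 sigma^2).
    With use_log_term=False the ln(sigma) term is dropped, the simplification
    the abstract reports as lossless."""
    metric = (r - mean) ** 2 / (2.0 * sigma ** 2)
    if use_log_term:
        metric += np.log(sigma)
    return metric

def viterbi_detect(r, means, sigmas, use_log_term=True):
    """Viterbi search on a toy 2-state trellis where the state is the previous
    bit. means[s, b] and sigmas[s, b] give the expected sample and noise std
    of the branch from state s under input bit b (transition-dependent noise).
    All of this structure is hypothetical, for illustration only."""
    n_states = means.shape[0]
    path_metric = np.zeros(n_states)
    backptr = []
    for rk in r:
        new_metric = np.full(n_states, np.inf)
        ptr = np.zeros(n_states, dtype=int)
        for s in range(n_states):
            for b in range(2):
                m = path_metric[s] + branch_metric(rk, means[s, b],
                                                   sigmas[s, b], use_log_term)
                if m < new_metric[b]:       # next state equals the input bit
                    new_metric[b] = m
                    ptr[b] = s
        path_metric = new_metric
        backptr.append(ptr)
    s = int(np.argmin(path_metric))         # trace back the survivor path
    bits = [s]
    for ptr in reversed(backptr[1:]):
        s = int(ptr[s])
        bits.append(s)
    return bits[::-1]

# Illustrative run with made-up parameters: dicode-like means y = b - s and
# larger noise on the transition branches to mimic a jitter-dominated channel.
means = np.array([[0.0, 1.0], [-1.0, 0.0]])
sigmas = np.array([[0.1, 0.3], [0.3, 0.1]])
samples = np.array([0.05, 0.92, -0.15, -0.88, 0.1])
print(viterbi_detect(samples, means, sigmas))                       # full metric
print(viterbi_detect(samples, means, sigmas, use_log_term=False))   # simplified
```

With use_log_term=False each branch cost collapses to a scaled squared error, matching the complexity reduction the abstract describes. In a jitter-dominated channel the transition branches carry the larger sigma, which is precisely where a pattern-dependent metric can gain over a conventional MLSD that assumes a single noise variance on every branch.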