A suboptimum maximum likelihood detector for severely distorted data signals using a sorting breadth-first strategy

The maximum likelihood sequence estimator (MLSE) has a very high computational complexity even when the Viterbi algorithm (VA) is used; hence its application in real-time systems is limited to cases of slight intersymbol interference (ISI). A suboptimum detector of low computational expense is presented which uses a sorting breadth-first strategy and pursues only the M most likely data sequences (the M-algorithm (MA)). Its degradation compared with the VA can be controlled by the number of retained paths M, and it converges towards unity (i.e., no loss) even for a few paths (M=2) if the overall impulse response g of the transmission channel has a short rise time. An extended expression for the error probability of the MLSE, including coloured noise and model errors, is given.
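The sorting breadth-first strategy described above can be sketched as follows. This is a minimal illustrative implementation, not the paper's own code: the function name, the binary (±1) symbol alphabet, the squared-error branch metric, and the test channel g are all assumptions made for the example. At each symbol interval, every surviving partial sequence is extended by each possible symbol, the candidates are sorted by accumulated metric, and only the M best are retained.

```python
def m_algorithm(received, g, M=2):
    """Sorting breadth-first detection (M-algorithm) for an ISI channel.

    Hypothetical sketch: channel model r[k] = sum_i g[i]*a[k-i] + noise,
    with binary symbols a[k] in {-1, +1}. Keeps only the M most likely
    partial symbol sequences (smallest accumulated squared error).
    """
    L = len(g)                      # length of the overall impulse response
    survivors = [(0.0, [])]         # (path metric, partial symbol sequence)
    for k, r in enumerate(received):
        candidates = []
        for metric, path in survivors:
            for a in (-1, +1):      # extend each survivor by both symbols
                new_path = path + [a]
                # noiseless channel output under this hypothesis
                y = sum(g[i] * new_path[k - i]
                        for i in range(L) if k - i >= 0)
                candidates.append((metric + (r - y) ** 2, new_path))
        # sorting breadth-first step: keep the M best candidates
        candidates.sort(key=lambda c: c[0])
        survivors = candidates[:M]
    return survivors[0][1]          # most likely data sequence
```

Compared with the VA, which keeps one survivor per channel state (exponential in the channel memory), this keeps only M survivors in total, so the complexity grows with M rather than with the ISI span.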