Loss of optimality in cross correlators

In simple binary detection, cross correlation gives the best possible output signal-to-noise ratio only if the background noise is white. Because of their ease of implementation, however, optical cross correlators are often used even in nonwhite-noise situations. General expressions and tight bounds are derived to quantify the loss in signal-to-noise ratio when a cross correlator is used instead of the truly optimal filter, i.e., a matched filter preceded by noise whitening. Such a quantification should enable the designer of an optical detection scheme to determine whether the extra effort of implementing the truly optimal filter is worth the resulting signal-to-noise ratio improvement.
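As an illustration of the loss being quantified, the following minimal numerical sketch (ours, not from the paper; the function name snr_loss and the Lorentzian noise spectrum are assumptions chosen for the example) compares the output signal-to-noise ratio of a plain cross correlator, H(f) = S*(f), with that of the optimal prewhitened matched filter, H(f) = S*(f)/Pn(f), using the standard linear-filter SNR expressions. The ratio of the two equals 1 only when the noise power spectral density Pn(f) is flat, i.e., when the noise is white.

import numpy as np

def snr_loss(S, Pn):
    """Return (snr_cc, snr_opt, rho) for spectra sampled on one frequency grid.

    snr_cc  -- output SNR of the cross correlator, H(f) = S*(f)
    snr_opt -- output SNR of the prewhitened matched filter, H(f) = S*(f)/Pn(f)
    rho     -- loss factor snr_cc / snr_opt; rho <= 1 by the Cauchy-Schwarz
               inequality, with equality iff Pn is constant (white noise)
    """
    S2 = np.abs(S) ** 2
    snr_cc = S2.sum() ** 2 / (S2 * Pn).sum()
    snr_opt = (S2 / Pn).sum()
    return snr_cc, snr_opt, snr_cc / snr_opt

# Example: flat signal spectrum in strongly colored (low-pass) noise.
f = np.linspace(-1.0, 1.0, 1001)    # normalized frequency grid
S = np.ones_like(f)                 # reference-signal spectrum (flat)
Pn = 1.0 / (1.0 + (5.0 * f) ** 2)   # Lorentzian (nonwhite) noise PSD

snr_cc, snr_opt, rho = snr_loss(S, Pn)
print(f"loss factor rho = {rho:.3f}")

For this strongly colored noise the loss factor rho falls well below 1, and it approaches 1 as Pn flattens toward a white spectrum; the sketch thus reproduces numerically the kind of loss the paper's expressions and bounds characterize analytically.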