An Application of Nonlinear Programming to Mismatched Filters

A burst waveform is a finite sequence of pulses with a staggered pulse repetition frequency (PRF), used as a high-resolution radar waveform. The ambiguity function of a burst waveform has a good peak-to-sidelobe ratio along the range axis; along the Doppler axis, however, the peak-to-sidelobe ratio is considerably poorer. A mismatched receiving filter is the logical way to increase the peak-to-sidelobe ratio along the Doppler axis, since Taylor weighting suppresses only the Doppler sidelobes close to the main peak. In this paper we derive, by the techniques of nonlinear programming, an iterative method for calculating a mismatched filter that is optimum in the following sense. Let an interval D of the Doppler axis be specified, together with a desired peak-to-sidelobe ratio W. If a filter exists that reduces the Doppler sidelobes to the specified level over the specified interval, our method calculates the one with optimum signal-to-noise ratio. If no such filter exists, the calculated filter will still tend to suppress the sidelobes over the specified interval of the Doppler axis.
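The optimality criterion described above can be read as a constrained quadratic problem: minimize the filter's output noise power w^H w subject to unit gain on the transmitted burst s (w^H s = 1) and sidelobe constraints |w^H s(f_d)| <= 1/W for every Doppler shift f_d in the interval D. The sketch below is a minimal numerical illustration of that formulation, not the paper's algorithm: it substitutes an iteratively reweighted penalty heuristic for the nonlinear-programming iteration, and the stagger pattern, the normalized interval D, the 40 dB target, and the penalty weight mu are all illustrative assumptions.

```python
import numpy as np

# Illustrative burst: 8 unit pulses on a staggered-PRF grid (the stagger
# pattern and all tuning constants below are assumptions, not the paper's).
positions = np.cumsum([0, 13, 11, 17, 12, 16, 10, 14])
N = positions[-1] + 1
s = np.zeros(N, dtype=complex)
s[positions] = 1.0
s /= np.linalg.norm(s)
n = np.arange(N)

def replica(fd):
    """The burst shifted in Doppler by normalized frequency fd."""
    return s * np.exp(2j * np.pi * fd * n)

# Specified Doppler interval D (kept off the mainlobe near fd = 0) and a
# desired peak-to-sidelobe ratio W of 40 dB.
D = np.linspace(0.02, 0.25, 200)
level = 10 ** (-40 / 20)

# Iteratively reweighted design: minimize w^H R w subject to w^H s = 1,
# where R starts as the identity (output noise power) and accumulates
# outer products of the replicas that still violate the sidelobe bound.
# The constrained minimizer has the closed form w = R^{-1}s / (s^H R^{-1}s).
R = np.eye(N, dtype=complex)
mu = 50.0                                      # penalty weight (assumption)
for _ in range(100):
    Rinv_s = np.linalg.solve(R, s)
    w = Rinv_s / (s.conj() @ Rinv_s)           # enforces w^H s = 1
    resp = np.array([w.conj() @ replica(fd) for fd in D])
    bad = np.abs(resp) > level
    if not bad.any():
        break                                  # all sidelobes within spec
    for fd in D[bad]:
        v = replica(fd)
        R += (mu / bad.sum()) * np.outer(v, v.conj())

resp = np.array([w.conj() @ replica(fd) for fd in D])
print(f"worst Doppler sidelobe over D: {20*np.log10(np.abs(resp).max()):.1f} dB")
print(f"SNR loss vs. matched filter:  {10*np.log10(1/np.linalg.norm(w)**2):.2f} dB")
```

Because the per-iteration subproblem is quadratic with a single linear constraint, each step reduces to one linear solve; this closed form stands in for a general nonlinear-programming step, and the final print statements report whichever sidelobe level the heuristic actually achieved, mirroring the paper's remark that the specified level may not be attainable.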