An Application of Nonlinear Programming to Mismatched Filters
A burst waveform is a finite sequence of pulses with a staggered pulse repetition frequency (PRF). It is used as a high-resolution radar waveform. The ambiguity function of a burst waveform has a good peak-to-sidelobe ratio along the range axis, but along the Doppler axis its peak-to-sidelobe ratio is not nearly as good. A mismatched receiving filter is the logical way to increase the peak-to-sidelobe ratio of the ambiguity function along the Doppler axis; Taylor weighting suppresses only the Doppler sidelobes that are close to the main peak. In this paper, we derive, by the techniques of nonlinear programming, an iterative method for calculating a mismatched filter that is optimum in the following sense. Let an interval D of the Doppler axis be specified, as well as a desired peak-to-sidelobe ratio W. Our method then calculates the mismatched filter with optimum signal-to-noise ratio that reduces the Doppler sidelobes to the specified level over the specified interval, if such a filter exists. If no such filter exists, the calculated filter will still tend to suppress the sidelobes over the specified interval of the Doppler axis.
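The trade-off described above can be sketched numerically. The following is a minimal illustration of the idea, not the paper's exact algorithm: for a hypothetical staggered burst `s`, it iteratively builds a quadratic penalty on Doppler-shifted replicas `s_k` sampled over an interval D, keeping unit gain on the signal (`h^H s = 1`) so that the SNR loss is simply `||h||^2 * N` relative to the matched filter. The stagger pattern, interval D, sidelobe target `w`, and penalty weight `lam` are all illustrative assumptions.

```python
import numpy as np

# Hypothetical staggered pulse times for a 9-pulse burst (illustrative stagger).
t = np.cumsum([0.0, 1.0, 1.3, 0.8, 1.6, 0.9, 1.2, 1.1, 1.4])
s = np.ones(len(t), dtype=complex)          # unit-amplitude pulses

# Sample the Doppler interval D; row k of S is the Doppler-shifted replica s_k.
freqs = np.linspace(0.15, 0.45, 61)
S = np.exp(2j * np.pi * np.outer(freqs, t))

w, lam = 0.08, 200.0                        # target sidelobe level W, penalty weight
h = s / len(s)                              # start from the (normalized) matched filter
active = np.zeros(len(freqs), dtype=bool)   # constraint set, grown iteratively

for _ in range(50):
    resp = np.abs(S @ np.conj(h))           # Doppler cut |h^H s_k| at zero delay
    active |= resp > w                      # add currently violated sidelobe constraints
    # Closed-form minimizer of ||h||^2 + lam * sum_active |h^H s_k|^2
    # subject to h^H s = 1 (Lagrange condition R h proportional to s).
    R = np.eye(len(s)) + lam * (S[active].T @ np.conj(S[active]))
    h = np.linalg.solve(R, s)
    h = h / np.conj(np.vdot(h, s))          # renormalize so that h^H s = 1

matched = np.abs(S @ np.conj(s / len(s)))
mismatched = np.abs(S @ np.conj(h))
print("matched peak sidelobe over D:   ", matched.max())
print("mismatched peak sidelobe over D:", mismatched.max())
print("SNR loss factor ||h||^2 * N:    ", np.vdot(h, h).real * len(s))
```

Because `h^H s = 1` is enforced, Cauchy-Schwarz gives `||h||^2 >= 1/N`, so the printed loss factor is at least 1; the gap quantifies the SNR paid for the Doppler sidelobe suppression, mirroring the feasibility question the paper poses (whether level W is attainable over D at all).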