Parallel frequency function-deep neural network for efficient complex broadband signal approximation

A neural network is essentially a high-dimensional, complex mapping model that fits features by adjusting its network weights. However, the spectral bias of network training demands a prohibitive number of training epochs to fit the high-frequency components of broadband signals. To improve the fitting efficiency for high-frequency components, the PhaseDNN was recently proposed, combining frequency-band extraction with frequency-shift techniques [Cai et al., SIAM J. Sci. Comput. 42, A3285 (2020)]. This paper is devoted to an alternative approach for fitting complex signals with high-frequency components. Here, a parallel frequency function-deep neural network (PFF-DNN) is proposed that suppresses computational overhead while ensuring fitting accuracy, by exploiting fast Fourier analysis of broadband signals together with the spectral-bias nature of neural networks. The effectiveness and efficiency of the proposed PFF-DNN are verified through detailed numerical experiments on six typical broadband signals.

[1] Liwei Wang, et al. The Expressive Power of Neural Networks: A View from the Width, NIPS, 2017.

[2] Yoshua Bengio, et al. On the Spectral Bias of Neural Networks, ICML, 2018.

[3] H. Nussbaumer. Fast Fourier Transform and Convolution Algorithms, 1981.

[4] Patrick Kidger, et al. Universal Approximation with Deep Narrow Networks, COLT, 2019.

[5] Jimmy Ba, et al. Adam: A Method for Stochastic Optimization, ICLR, 2014.

[6] Zheng Ma, et al. Frequency Principle: Fourier Analysis Sheds Light on Deep Neural Networks, Communications in Computational Physics, 2019.

[7] L. Glass, et al. Oscillation and chaos in physiological control systems, Science, 1977.

[8] Geoffrey E. Hinton, et al. Speech recognition with deep recurrent neural networks, 2013 IEEE International Conference on Acoustics, Speech and Signal Processing, 2013.

[9] Steven Guan, et al. Investigation of Neural Networks for Function Approximation, ITQM, 2013.

[10] Sepp Hochreiter, et al. Fast and Accurate Deep Network Learning by Exponential Linear Units (ELUs), ICLR, 2015.

[11] George Cybenko, et al. Approximation by superpositions of a sigmoidal function, Math. Control Signals Syst., 1989.

[12] Y. Lu, et al. A Sequential Learning Scheme for Function Approximation Using Minimal Radial Basis Function Neural Networks, Neural Computation, 1997.

[13] Xiaoguang Li, et al. A Phase Shift Deep Neural Network for High Frequency Approximation and Wave Problems, SIAM J. Sci. Comput., 2020.

[14] Xiang-Gen Xia, et al. System identification using chirp signals and time-variant filters in the joint time-frequency domain, IEEE Trans. Signal Process., 1997.

[15] S. Nash, et al. Numerical Methods and Software, 1990.

[16] Meng Joo Er, et al. Dynamic fuzzy neural networks: a novel approach to function approximation, IEEE Trans. Syst. Man Cybern. Part B, 2000.

[17] P. S. Lewis, et al. Function approximation and time series prediction with neural networks, 1990 IJCNN International Joint Conference on Neural Networks, 1990.

[18] Nei Kato, et al. State-of-the-Art Deep Learning: Evolving Machine Intelligence Toward Tomorrow's Intelligent Network Traffic Control Systems, IEEE Communications Surveys & Tutorials, 2017.

[19] Zenghui Wang, et al. Deep Convolutional Neural Networks for Image Classification: A Comprehensive Review, Neural Computation, 2017.

[20] Guigang Zhang, et al. Deep Learning, Int. J. Semantic Comput., 2016.

[21] Lukasz Kaiser, et al. Attention Is All You Need, NIPS, 2017.