Complex-valued functional link network design by orthogonal least squares method for function approximation problems

This paper presents a fully complex-valued functional link network (CFLN). The CFLN is a single-layer neural network that introduces nonlinearity in the input layer through nonlinear functions of the original input variables; in this study, multivariate polynomials serve as those nonlinear functions. Unlike multilayer neural networks, the CFLN is free from the local minima problem, and its linear-in-parameters structure allows very fast learning. In the complex domain, the polynomial-based CFLN has the additional advantage of not requiring activation functions, which are a major design concern in complex-valued neural networks. However, since the number of all possible polynomial terms (monomials) may be quite large, it is important to select a small subset of them for faster and better performance. In this paper, we apply the orthogonal least squares method in a constructive fashion (proceeding from lower-degree to higher-degree terms) to select a parsimonious subset of monomials. Simulation results demonstrate that computing the CFLN purely in the complex domain is more advantageous than computing it in the double-dimensional real domain in terms of the number of connection parameters, design speed, and possibly generalization performance. Moreover, the proposed CFLN compares favorably with several other multilayer networks in the complex domain.
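
To make the idea concrete, the following is a minimal sketch of a polynomial CFLN with greedy orthogonal-least-squares term selection. The function names (build_monomials, ols_select), the degree and subset sizes, and the toy target function are illustrative assumptions, not details taken from the paper; the selection step is the standard forward OLS procedure based on the error-reduction criterion, shown only to convey the overall workflow.

```python
# Sketch of a polynomial complex-valued functional link network (CFLN):
# expand complex inputs into monomials, greedily select a parsimonious
# subset by orthogonal least squares, then solve a linear least-squares
# problem for the output weights. Illustrative code, not the authors' own.
import itertools
import numpy as np

def build_monomials(X, max_degree):
    """Expand complex inputs X (n_samples, n_inputs) into all monomials
    of total degree 0..max_degree."""
    n, d = X.shape
    columns = [np.ones(n, dtype=complex)]          # degree-0 (bias) term
    labels = [()]
    for deg in range(1, max_degree + 1):
        for combo in itertools.combinations_with_replacement(range(d), deg):
            columns.append(np.prod(X[:, combo], axis=1))  # product of chosen inputs
            labels.append(combo)
    return np.column_stack(columns), labels

def ols_select(Phi, y, n_terms):
    """Greedy forward selection: at each step pick the candidate column whose
    component orthogonal to the already-selected basis explains the most of y
    (the classical error-reduction criterion of OLS)."""
    n, m = Phi.shape
    selected, Q = [], []
    residual = y.astype(complex).copy()
    for _ in range(n_terms):
        best_j, best_q, best_err = None, None, -np.inf
        for j in range(m):
            if j in selected:
                continue
            q = Phi[:, j].astype(complex)
            for qk in Q:                            # orthogonalize against chosen basis
                q = q - (np.vdot(qk, q) / np.vdot(qk, qk)) * qk
            denom = np.vdot(q, q).real
            if denom < 1e-12:
                continue
            err = abs(np.vdot(q, residual)) ** 2 / denom   # energy explained by this term
            if err > best_err:
                best_err, best_j, best_q = err, j, q
        if best_j is None:
            break
        selected.append(best_j)
        Q.append(best_q)
        residual = residual - (np.vdot(best_q, residual) / np.vdot(best_q, best_q)) * best_q
    return selected

# Usage: approximate a complex-valued function of two complex inputs.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 2)) + 1j * rng.standard_normal((200, 2))
y = X[:, 0] * X[:, 1] + 0.5 * X[:, 0] ** 2          # hypothetical target function

Phi, labels = build_monomials(X, max_degree=3)
idx = ols_select(Phi, y, n_terms=5)                  # parsimonious subset of monomials
w, *_ = np.linalg.lstsq(Phi[:, idx], y, rcond=None)  # linear-in-parameters fit
print("selected monomials:", [labels[j] for j in idx])
print("residual norm:", np.linalg.norm(Phi[:, idx] @ w - y))
```

Because the model is linear in its parameters, the final training step is a single complex least-squares solve over the selected monomials, which is what gives the CFLN its fast, local-minima-free design.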
