Complex generalized-mean neuron model and its applications

A key element of neurocomputing research in the complex domain is the development of artificial neuron models with improved computational power and generalization ability. Non-linear activity in neuronal interactions is observed in biological neurons. This paper presents the architecture of a neuron with a non-linear aggregation function for complex-valued signals. The proposed aggregation function is conceptually based on the generalized mean of the signals impinging on a neuron. This function is general enough to realize various conventional aggregation functions as special cases. The generalized-mean neuron has a simple structure, and varying the value of the generalization parameter captures the higher-order structure of a neuron. Hence, it can be used without the risk of combinatorial explosion that afflicts higher-order neurons. The superiority of a network of the proposed neurons over real and complex multilayer perceptrons is demonstrated through a variety of experiments.
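As a rough illustrative sketch (not the paper's implementation), the generalized-mean aggregation idea can be expressed as follows. The function and parameter names here are assumptions for illustration; for complex inputs, the power uses NumPy's principal branch, and the familiar special cases (arithmetic, harmonic, quadratic mean) emerge for real inputs at particular values of the generalization parameter p.

```python
import numpy as np

def generalized_mean(z, w, p):
    """Generalized-mean aggregation (illustrative sketch):
    M_p(z) = (sum_i w_i * z_i**p) ** (1/p).
    z : complex-valued input signals
    w : weights (expected to sum to 1 for a true mean)
    p : generalization parameter; p=1 gives the weighted arithmetic
        mean, p=-1 the harmonic mean, p=2 the quadratic mean.
    Complex powers are taken on NumPy's principal branch.
    """
    z = np.asarray(z, dtype=complex)
    w = np.asarray(w, dtype=complex)
    return np.sum(w * z**p) ** (1.0 / p)

# Real-valued sanity check with equal weights:
z = [1.0, 2.0, 4.0]
w = [1/3, 1/3, 1/3]
arithmetic = generalized_mean(z, w, 1)    # (1+2+4)/3
harmonic   = generalized_mean(z, w, -1)   # 3 / (1 + 1/2 + 1/4)
quadratic  = generalized_mean(z, w, 2)    # sqrt((1+4+16)/3)
```

Sweeping p thus moves a single neuron smoothly through a family of aggregation behaviors, which is the structural flexibility the abstract attributes to the generalized-mean neuron.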
