The novel aggregation function-based neuron models in complex domain

The computational power of a neuron lies in the spatial grouping of synapses on its dendritic tree. Giving this grouping process a mathematical representation remains a fascinating line of work for researchers in the neural network community. In the literature, we generally find neuron models that comprise a summation, radial basis, or product aggregation function as the basic unit of a feed-forward multilayer neural network. Each of these models, and the networks built from them, has its own merits and demerits: a multilayer perceptron (MLP) constructs a global approximation to the input–output mapping, whereas an RBF network, using exponentially decaying localized non-linearities, constructs a local approximation. In this paper, we propose two novel compensatory-type aggregation functions for artificial neurons. They produce the net potential as a linear or non-linear composition of the basic summation and radial basis operations over a set of input signals. Neuron models based on these aggregation functions yield faster convergence and better training and prediction accuracy. The learning and generalization capabilities of these neurons have been tested on various classification and functional mapping problems, where they have also shown excellent generalization on two-dimensional transformations.
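As a rough illustration of the idea, the compensatory aggregation described above can be sketched as a convex combination of a complex weighted sum and a real-valued radial basis term, followed by a "split" activation that acts on the real and imaginary parts separately. This is a minimal sketch under stated assumptions, not the paper's actual formulation: the function names, the Gaussian kernel, the compensation parameter `gamma`, and the split-tanh activation are all hypothetical choices for illustration.

```python
import numpy as np

def compensatory_aggregation(x, w, c, gamma, sigma=1.0):
    """Hypothetical compensatory neuron: combine a complex weighted sum
    with a radial basis (Gaussian) term over the same inputs.

    x : complex input vector
    w : complex synaptic weights
    c : complex RBF centre
    gamma in [0, 1] : compensation parameter trading off the two operations
    """
    summation = np.dot(w, x)  # complex weighted sum (no conjugation)
    # Real-valued Gaussian of the distance from the centre in C^n
    rbf = np.exp(-np.sum(np.abs(x - c) ** 2) / (2.0 * sigma ** 2))
    # Linear (convex) composition of the two aggregations
    return gamma * summation + (1.0 - gamma) * rbf

def split_tanh(z):
    """'Split' activation common in complex-valued networks:
    tanh applied to real and imaginary parts independently."""
    return np.tanh(np.real(z)) + 1j * np.tanh(np.imag(z))

# Tiny usage example with random complex inputs and weights
rng = np.random.default_rng(0)
x = rng.standard_normal(3) + 1j * rng.standard_normal(3)
w = rng.standard_normal(3) + 1j * rng.standard_normal(3)
c = np.zeros(3, dtype=complex)
y = split_tanh(compensatory_aggregation(x, w, c, gamma=0.7))
```

With `gamma = 1` the neuron reduces to an ordinary complex summation unit, and with `gamma = 0` to a pure RBF unit, which is the sense in which the composition "compensates" between global and local approximation.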
