On the learning machine with compensatory aggregation based neurons in quaternionic domain

Abstract The nonlinear spatial grouping process of synapses is one of the fascinating methodologies through which neuro-computing researchers seek to achieve the computational power of a neuron. Researchers generally construct multi-layered feed-forward neural networks from neuron models based on summation (linear), product (linear), or radial basis (nonlinear) aggregation of synapses, but each of these neuron models and its corresponding neural network has its own advantages and disadvantages. The multi-layered network is generally used to accomplish global approximation of an input–output mapping but sometimes gets stuck in local minima, while the nonlinear radial basis function (RBF) network, based on exponentially decaying functions, is used for local approximation of an input–output mapping. These trade-offs motivated the design of two new artificial neuron models based on compensatory aggregation functions in the quaternionic domain. The net internal potentials of these neuron models are formed by composing the basic summation (linear) and radial basis (nonlinear) operations on quaternionic-valued input signals. The neuron models based on these aggregation functions ensure faster convergence and better training and prediction accuracy. The learning and generalization capabilities of these neurons are verified on various three-dimensional transformations and time-series predictions as benchmark problems.
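The abstract does not reproduce the aggregation formula itself, so the following is only a minimal sketch of one plausible compensatory composition, assuming a convex blend of the two potentials controlled by a compensation parameter gamma; the function names, the Gaussian RBF form, and the placement of the RBF response on the real axis of a quaternion are illustrative assumptions, not the authors' definitions.

```python
import numpy as np

def quat_mult(p, q):
    """Hamilton product of two quaternions stored as (w, x, y, z) arrays."""
    w1, x1, y1, z1 = p
    w2, x2, y2, z2 = q
    return np.array([
        w1 * w2 - x1 * x2 - y1 * y2 - z1 * z2,
        w1 * x2 + x1 * w2 + y1 * z2 - z1 * y2,
        w1 * y2 - x1 * z2 + y1 * w2 + z1 * x2,
        w1 * z2 + x1 * y2 - y1 * x2 + z1 * w2,
    ])

def compensatory_potential(inputs, weights, centers, bias,
                           gamma=0.5, beta=1.0):
    """Hypothetical net internal potential blending a summation (linear)
    aggregation with a radial basis (nonlinear) aggregation of
    quaternion-valued inputs.

    inputs, weights, centers -- lists of quaternions as length-4 arrays
    gamma -- illustrative compensation parameter in [0, 1] (an assumption)
    beta  -- illustrative RBF spread parameter (an assumption)
    """
    # Linear potential: weighted Hamilton products summed with a bias.
    v_sum = bias + sum(quat_mult(w, x) for w, x in zip(weights, inputs))
    # Nonlinear potential: Gaussian of the squared distance to the centers,
    # placed on the real axis of a quaternion (an assumption of this sketch).
    d2 = sum(np.sum((x - c) ** 2) for x, c in zip(inputs, centers))
    v_rbf = np.array([np.exp(-beta * d2), 0.0, 0.0, 0.0])
    # Compensatory composition of the two potentials.
    return gamma * v_sum + (1.0 - gamma) * v_rbf

# Example: one neuron with two quaternionic inputs.
x = [np.array([1.0, 0.5, -0.2, 0.1]), np.array([0.3, 0.0, 0.7, -0.4])]
w = [np.array([0.2, 0.1, 0.0, 0.3]), np.array([-0.1, 0.4, 0.2, 0.0])]
c = [np.zeros(4), np.zeros(4)]
b = np.array([0.1, 0.0, 0.0, 0.0])
print(compensatory_potential(x, w, c, b, gamma=0.6))
```

In this sketch, gamma = 1 recovers a purely linear summation neuron and gamma = 0 a pure RBF neuron, which matches the abstract's framing of the compensatory neuron as a composition of the two aggregations rather than a choice between them.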
