On the investigation of activation functions in gradient neural network for online solving linear matrix equation

Abstract In this paper, we investigate the effect of different activation functions (AFs) on the convergence performance of a gradient-based neural network (GNN) for online solution of the linear matrix equation AXB + X = C. It is observed that, by employing different AFs, i.e., the linear, power-sigmoid, sign-power, and general sign-bi-power functions, the presented GNN model achieves different convergence performance. More specifically, if the linear function is employed, the GNN model achieves exponential convergence; if the power-sigmoid function is employed, superior convergence is achieved compared with the linear case; and if the sign-power and general sign-bi-power functions are employed, the GNN model achieves finite-time and fixed-time convergence, respectively. Detailed theoretical proofs are offered to demonstrate these facts. In addition, the exponential convergence rate and the upper bounds on the finite and fixed convergence times are estimated theoretically. Finally, two illustrative examples are presented to further substantiate the theoretical results and the effectiveness of the presented GNN model for solving the linear matrix equation.
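For concreteness, the sketch below shows one way such an activated GNN iteration can be realized numerically. It is a minimal illustration under stated assumptions, not the authors' exact formulation: it assumes the dynamics dX/dt = -gamma * Phi(A^T E B^T + E) with residual E = AXB + X - C (the gradient of (1/2)||E||_F^2), a simple forward-Euler discretization, and illustrative choices of the gain gamma, step size h, and AF parameters.

    import numpy as np

    # Power-sigmoid AF (one common form): odd power outside [-1, 1], scaled sigmoid inside.
    def power_sigmoid(e, p=3, xi=4.0):
        sig = (1 + np.exp(-xi)) / (1 - np.exp(-xi)) \
              * (1 - np.exp(-xi * e)) / (1 + np.exp(-xi * e))
        return np.where(np.abs(e) >= 1.0, np.sign(e) * np.abs(e) ** p, sig)

    # Sign-power AF: sign(e)|e|^r with 0 < r < 1, associated with finite-time convergence.
    def sign_power(e, r=0.5):
        return np.sign(e) * np.abs(e) ** r

    def gnn_solve(A, B, C, phi, gamma=10.0, h=1e-3, steps=20000):
        """Forward-Euler integration of the assumed GNN dynamics
        dX/dt = -gamma * phi(A^T E B^T + E), where E = A X B + X - C."""
        X = np.zeros_like(C)
        for _ in range(steps):
            E = A @ X @ B + X - C           # residual of the matrix equation
            grad = A.T @ E @ B.T + E        # gradient of (1/2)||E||_F^2 w.r.t. X
            X = X - h * gamma * phi(grad)   # activated gradient-descent step
        return X

    # Example: recover a known solution X_true (randomly generated test problem).
    rng = np.random.default_rng(0)
    A, B = rng.standard_normal((3, 3)), rng.standard_normal((3, 3))
    X_true = rng.standard_normal((3, 3))
    C = A @ X_true @ B + X_true
    X_hat = gnn_solve(A, B, C, sign_power)
    print(np.linalg.norm(X_hat - X_true))

Swapping phi (e.g., the identity map for the linear case) changes only the activation step, mirroring how the paper compares convergence behavior across AFs; the gain, step size, and AF parameters above are purely illustrative.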
