Compact Extreme Learning Machines for biological systems

In data-driven black-box modelling of biological systems, it is essential to produce a parsimonious model of the system behaviour both effectively and efficiently. The Extreme Learning Machine (ELM) is a recent fast learning paradigm; however, the resulting model is not necessarily sparse. In this paper, an improved ELM is investigated with the aim of obtaining a more compact model without significantly increasing the overall computational complexity. This is achieved by associating each model term with a regularized parameter, so that insignificant terms are automatically deselected, leading to improved model sparsity. Experimental results on biochemical data confirm the effectiveness of the approach.
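
As a rough illustration of the idea of term-wise regularization pruning an ELM, the sketch below assumes a sigmoid hidden layer and an l1-regularized least-squares fit of the output weights via scikit-learn's Lasso. These choices, and all names in the code, are illustrative assumptions rather than the paper's actual per-term regularization scheme; they only show how hidden nodes whose output weights shrink to zero can be dropped automatically, yielding a more compact model.

    # Minimal sketch of a compact (sparse) ELM, assuming a sigmoid hidden
    # layer and an l1-regularized solve of the output weights (scikit-learn
    # Lasso). The paper's own per-term regularization scheme may differ.
    import numpy as np
    from sklearn.linear_model import Lasso

    rng = np.random.default_rng(0)

    def train_compact_elm(X, y, n_hidden=100, alpha=1e-3):
        """Fit an ELM, then prune hidden nodes whose output weights are ~0."""
        n_features = X.shape[1]
        # Random input weights and biases (the standard ELM step).
        W = rng.normal(size=(n_features, n_hidden))
        b = rng.normal(size=n_hidden)
        H = 1.0 / (1.0 + np.exp(-(X @ W + b)))      # hidden-layer outputs
        # The l1 penalty drives the weights of insignificant terms to zero.
        reg = Lasso(alpha=alpha, max_iter=10000).fit(H, y)
        keep = np.abs(reg.coef_) > 1e-8             # surviving model terms
        return W[:, keep], b[keep], reg.coef_[keep], reg.intercept_

    def predict(X, W, b, beta, beta0):
        H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
        return H @ beta + beta0

    # Toy usage: fit a 1-D nonlinear target and report the retained model size.
    X = rng.uniform(-3, 3, size=(200, 1))
    y = np.sin(X[:, 0]) + 0.05 * rng.normal(size=200)
    W, b, beta, beta0 = train_compact_elm(X, y, n_hidden=100, alpha=1e-3)
    print(f"retained {beta.size} of 100 hidden nodes")

In this toy setting the number of retained hidden nodes is typically far smaller than the initial 100, while prediction accuracy on the smooth target remains essentially unchanged; the trade-off between sparsity and accuracy is controlled by the penalty weight alpha.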
