New classifier based on compressed dictionary and LS-SVM

Inspired by compressive sensing (CS) theory, a new classifier based on a compressed dictionary and the Least Squares Support Vector Machine (LS-SVM) is proposed to deal with large-scale problems. If the LS-SVM solution is given an approximately sparse structure, the support-vector coefficients can be recovered from only a few measurements. Using the well-known Cholesky decomposition, we approximate the kernel matrix by a low-rank matrix in which the support-vector coefficients admit a sparse representation; this low-rank matrix serves as the dictionary. Coupling the proposed measurement matrix with this dictionary yields a compressed dictionary, which is shown to satisfy the restricted isometry property (RIP). The resulting classifier combines low storage and computational complexity with a high degree of sparsity and good information preservation. Experiments on benchmark data sets show that the proposed classifier performs favorably.

Highlights:
- A classifier using the framework of compressive sensing is proposed.
- Sparse solutions and compressed versions of signals are sought directly.
- A compressed dictionary is constructed to deal with large-scale problems.
- The mechanism operates without imposing any prior hypothesis on the data.
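As a rough illustration of the pipeline sketched in the abstract (a minimal sketch under stated assumptions, not the authors' exact algorithm), the following Python snippet builds an LS-SVM dual solution with an RBF kernel, forms a low-rank dictionary with a pivoted (incomplete) Cholesky factorization, couples it with a random Gaussian measurement matrix to obtain a compressed dictionary, and recovers a sparse coefficient representation with orthogonal matching pursuit (OMP). The pivoted_cholesky helper and all hyperparameters (gamma, C, rank, number of measurements, sparsity level) are illustrative assumptions.

```python
# Illustrative sketch of the compressed-dictionary idea; an assumed interpretation,
# not the paper's exact algorithm. All names and hyperparameters are illustrative.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=200, n_features=10, random_state=0)
y = 2.0 * y - 1.0                                   # LS-SVM labels in {-1, +1}

# LS-SVM dual system (bias term omitted for brevity): (K + I/C) alpha = y
K = rbf_kernel(X, gamma=0.5)
C = 10.0
alpha = np.linalg.solve(K + np.eye(len(y)) / C, y)  # dense, non-sparse coefficients

def pivoted_cholesky(K, rank):
    """Low-rank pivoted (incomplete) Cholesky: K ~= G @ G.T, G of shape (n, rank)."""
    n = K.shape[0]
    G = np.zeros((n, rank))
    d = np.diag(K).copy()
    for j in range(rank):
        p = int(np.argmax(d))                       # pivot on largest residual diagonal
        G[:, j] = (K[:, p] - G @ G[p, :]) / np.sqrt(d[p])
        d -= G[:, j] ** 2
    return G

G = pivoted_cholesky(K, rank=30)                    # low-rank dictionary (n x r)

# Compressed dictionary: random Gaussian measurement matrix Phi coupled with G
m = 60                                              # number of measurements (m << n)
Phi = rng.standard_normal((m, len(y))) / np.sqrt(m)
A = Phi @ G                                         # compressed dictionary (m x r)
b = Phi @ alpha                                     # a few measurements of the coefficients

# Recover a sparse representation of alpha over the dictionary G via OMP
omp = OrthogonalMatchingPursuit(n_nonzero_coefs=15, fit_intercept=False).fit(A, b)
alpha_hat = G @ omp.coef_                           # reconstructed coefficient vector

print("relative coefficient error:",
      np.linalg.norm(alpha - alpha_hat) / np.linalg.norm(alpha))
print("decision-sign agreement  :",
      np.mean(np.sign(K @ alpha_hat) == np.sign(K @ alpha)))
```

A Gaussian measurement matrix is used here only because it is a standard choice known to satisfy the RIP with high probability; the paper's specific measurement-matrix construction may differ.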
