Combined nonlinear visualization and classification: ELMVIS++C

This paper presents an improvement of the ELMVIS+ method for fast nonlinear dimensionality reduction. ELMVIS++C adds a supervised learning component to ELMVIS+, which, like the majority of dimensionality reduction methods, is originally unsupervised. This component prevents samples of the same class from being separated from each other in the visualization. In the improved method, the importance of the supervised component can be tuned to exert different levels of influence. Test results on four datasets indicate that the proposed improvement not only maintains the performance of ELMVIS+, but is also highly beneficial for applications where visualizing the data in relation to its class labels is important.
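To illustrate the idea of blending an unsupervised visualization cost with a tunable supervised class-cohesion term, here is a minimal Python sketch. It is an assumption-based illustration, not the paper's exact formulation: the function name combined_cost, the weight alpha, and the use of mean pairwise within-class distance as the supervised term are all hypothetical choices standing in for the details given in the paper.

```python
import numpy as np

def combined_cost(unsup_cost, V, labels, alpha=0.5):
    """Hypothetical ELMVIS++C-style objective (sketch only).

    unsup_cost : scalar unsupervised ELMVIS+ cost (e.g., reconstruction error)
    V          : (n_samples, d) coordinates in the visualization space
    labels     : (n_samples,) class labels
    alpha      : assumed weight controlling the influence of the supervised term
    """
    labels = np.asarray(labels)
    # Supervised term: average pairwise distance between samples of the same
    # class, penalizing embeddings that pull same-class samples apart.
    sup, n_classes = 0.0, 0
    for c in np.unique(labels):
        Vc = V[labels == c]
        if len(Vc) > 1:
            diffs = Vc[:, None, :] - Vc[None, :, :]
            sup += np.sqrt((diffs ** 2).sum(axis=-1)).mean()
            n_classes += 1
    sup /= max(n_classes, 1)

    # Tunable blend: alpha = 0 recovers a purely unsupervised criterion,
    # larger alpha emphasizes keeping same-class samples together.
    return (1.0 - alpha) * unsup_cost + alpha * sup
```

Setting alpha to zero would reduce the criterion to the unsupervised case, while larger values increasingly favor visualizations in which same-class samples stay close, mirroring the tunable influence described in the abstract.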
