A Review on Dimensionality Reduction Techniques

High-dimensional data is ubiquitous in scientific research and industrial production. It carries a great deal of information, but at the same time, because of its sparsity and redundancy, it also...
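
As a minimal illustration of the kind of dimensionality reduction this review surveys, the sketch below projects some hypothetical high-dimensional data onto its leading principal components using a plain NumPy SVD (classical PCA). The toy data, the target dimension k, and all variable names are assumptions chosen for illustration, not details taken from the paper.

```python
import numpy as np

# Hypothetical toy data: 200 samples in a 50-dimensional feature space.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 50))

# Center the data, then take a truncated SVD; the top-k right singular
# vectors define a k-dimensional linear projection (classical PCA).
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

k = 2                        # assumed target dimensionality
Z = Xc @ Vt[:k].T            # low-dimensional representation, shape (200, 2)

# Fraction of total variance retained by the k components.
explained = (S[:k] ** 2).sum() / (S ** 2).sum()
print(Z.shape, round(float(explained), 3))
```

Linear projections such as PCA are only one family covered by the review; feature-selection and nonlinear manifold-learning methods trade this simplicity for interpretability or for the ability to capture curved structure in the data.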
