Unsupervised feature selection based on the Morisita estimator of intrinsic dimension
[1] Antonino Staiano,et al. Intrinsic dimension estimation: Advances and open problems , 2016, Inf. Sci..
[2] Huan Liu,et al. Feature Selection for Clustering: A Review , 2018, Data Clustering: Algorithms and Applications.
[3] R Core Team. R: A language and environment for statistical computing , 2014 .
[4] Hongwei Hao,et al. Selecting feature subset with sparsity and low redundancy for unsupervised learning , 2015, Knowl. Based Syst..
[5] Fang Liu,et al. Unsupervised feature selection based on maximum information and minimum redundancy for hyperspectral images , 2016, Pattern Recognit..
[6] Jacob Cohen. A Coefficient of Agreement for Nominal Scales , 1960 .
[7] Seoung Bum Kim,et al. Unsupervised feature selection using weighted principal components , 2011, Expert Syst. Appl..
[8] M. Kanevski,et al. The multipoint Morisita index for the analysis of spatial patterns , 2013, 1307.3756.
[9] Kewei Cheng,et al. Feature Selection , 2016, ACM Comput. Surv..
[10] Juha Reunanen,et al. Overfitting in Making Comparisons Between Variable Selection Methods , 2003, J. Mach. Learn. Res..
[11] A. Wayne Whitney,et al. A Direct Method of Nonparametric Measurement Selection , 1971, IEEE Transactions on Computers.
[12] Masoud Nikravesh,et al. Feature Extraction - Foundations and Applications , 2006, Feature Extraction.
[13] Leo Breiman,et al. Random Forests , 2001, Machine Learning.
[14] Huan Liu,et al. Spectral Feature Selection for Data Mining , 2011 .
[15] C. D. Gelatt,et al. Optimization by Simulated Annealing , 1983, Science.
[16] Robert Tibshirani,et al. The Elements of Statistical Learning: Data Mining, Inference, and Prediction, 2nd Edition , 2001, Springer Series in Statistics.
[17] S. J. Reeves. An efficient implementation of the backward greedy algorithm for sparse signal reconstruction , 1999, IEEE Signal Processing Letters.
[18] M. Morisita. Measuring of dispersion of individuals and analysis of the distributional patterns , 1961 .
[19] Laura Schweitzer,et al. Advances in Kernel Methods: Support Vector Learning , 2016 .
[20] Karl Pearson. LIII. On lines and planes of closest fit to systems of points in space , 1901 .
[21] S. Hurlbert. Spatial distribution of the montane unicorn , 1990 .
[22] C. A. Murthy,et al. Unsupervised Feature Selection Using Feature Similarity , 2002, IEEE Trans. Pattern Anal. Mach. Intell..
[23] Thomas Demeester,et al. Knowledge base population using semantic label propagation , 2015, Knowl. Based Syst..
[24] Yoram Bresler,et al. Fast optimal and suboptimal algorithms for sparse solutions to linear inverse problems , 1998, Proceedings of the 1998 IEEE International Conference on Acoustics, Speech and Signal Processing, ICASSP '98 (Cat. No.98CH36181).
[25] Mikhail F. Kanevski,et al. Morisita-based feature selection for regression problems , 2015, ESANN.
[26] Chien-Hsing Chen. Feature selection for clustering using instance-based learning by exploring the nearest and farthest neighbors , 2015, Inf. Sci..
[27] Adolfo Martínez Usó,et al. Clustering-Based Hyperspectral Band Selection Using Information Measures , 2007, IEEE Transactions on Geoscience and Remote Sensing.
[28] Witold Pedrycz,et al. Unsupervised feature selection via maximum projection and minimum redundancy , 2015, Knowl. Based Syst..
[29] Wei Sun,et al. Multiple kernel dimensionality reduction via spectral regression and trace ratio maximization , 2015, Knowl. Based Syst..
[30] Sankar K. Pal,et al. Unsupervised Feature Selection , 2004 .
[31] Bernhard Schölkopf,et al. Kernel Principal Component Analysis , 1997, ICANN.
[32] Han Wang,et al. Unsupervised feature selection via low-rank approximation and structure learning , 2017, Knowl. Based Syst..
[33] Hiroshi Motoda,et al. Computational Methods of Feature Selection , 2007 .
[34] Mikhail F. Kanevski,et al. Feature Selection for Regression Problems Based on the Morisita Estimator of Intrinsic Dimension: Concept and Case Studies , 2016, Pattern Recognit..
[35] ChengXiang Zhai,et al. Robust Unsupervised Feature Selection , 2013, IJCAI.
[36] Chenxia Jin,et al. Feature selection with partition differentiation entropy for large-scale data sets , 2016, Inf. Sci..
[37] Samuel H. Huang,et al. Fractal-Based Intrinsic Dimension Estimation and Its Application in Dimensionality Reduction , 2012, IEEE Transactions on Knowledge and Data Engineering.
[38] Michel Verleysen,et al. Nonlinear Dimensionality Reduction , 2021, Computer Vision.
[39] Ron Kohavi,et al. Wrappers for Feature Subset Selection , 1997, Artif. Intell..
[40] P. Grassberger,et al. Measuring the Strangeness of Strange Attractors , 1983 .
[41] Simon C. K. Shiu,et al. Unsupervised feature selection by regularized self-representation , 2015, Pattern Recognit..
[42] Belén Melián-Batista,et al. High-dimensional feature selection via feature grouping: A Variable Neighborhood Search approach , 2016, Inf. Sci..
[43] Carla E. Brodley,et al. Unsupervised Feature Selection Applied to Content-Based Retrieval of Lung Images , 2003, IEEE Trans. Pattern Anal. Mach. Intell..
[44] D. Ruppert. The Elements of Statistical Learning: Data Mining, Inference, and Prediction , 2004 .
[45] Christos Faloutsos,et al. Fast feature selection using fractal dimension , 2010, J. Inf. Data Manag..
[46] Fuhui Long,et al. Feature selection based on mutual information criteria of max-dependency, max-relevance, and min-redundancy , 2003, IEEE Transactions on Pattern Analysis and Machine Intelligence.
[47] Andy Liaw,et al. Classification and Regression by randomForest , 2007 .
[48] Deng Cai,et al. Unsupervised feature selection for multi-cluster data , 2010, KDD.
[49] Huan Liu,et al. Spectral feature selection for supervised and unsupervised learning , 2007, ICML '07.
[50] Mikhail F. Kanevski,et al. A new estimator of intrinsic dimension based on the multipoint Morisita index , 2014, Pattern Recognit..
[51] Christos Faloutsos,et al. A fast and effective method to find correlations among attributes in databases , 2007, Data Mining and Knowledge Discovery.
[52] Michele Volpi,et al. Semi-supervised multiview embedding for hyperspectral data classification , 2014, Neurocomputing.
[53] A. Asuncion,et al. UCI Machine Learning Repository, University of California, Irvine, School of Information and Computer Sciences , 2007 .
[54] Anil K. Jain,et al. Feature Selection: Evaluation, Application, and Small Sample Performance , 1997, IEEE Trans. Pattern Anal. Mach. Intell..
[55] Christos Faloutsos,et al. Fast Feature Selection using Fractal Dimension - Ten Years Later , 2010, J. Inf. Data Manag..
[56] Deng Cai,et al. Laplacian Score for Feature Selection , 2005, NIPS.
[57] Richard Bellman,et al. Adaptive Control Processes: A Guided Tour , 1961, The Mathematical Gazette.
[58] Ron Kohavi,et al. Irrelevant Features and the Subset Selection Problem , 1994, ICML.
[59] H. G. E. Hentschel,et al. The infinite number of generalized dimensions of fractals and strange attractors , 1983 .
[60] Wenli Xu,et al. Supervised feature subset selection with ordinal optimization , 2014, Knowl. Based Syst..
[61] Carla E. Brodley,et al. Feature Selection for Unsupervised Learning , 2004, J. Mach. Learn. Res..
[62] Tao Li,et al. Cost-sensitive feature selection using random forest: Selecting low-cost subsets of informative features , 2016, Knowl. Based Syst..