A framework for feature selection through boosting
Ahmad Alsahaf | Vikram Shenoy | Nicolai Petkov | George Azzopardi
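As context for the title, the general idea behind boosting-based feature selection is to train a boosted ensemble of weak learners and credit each feature with the weight of the rounds in which it was chosen, then keep the highest-scoring features. The sketch below is a minimal pure-Python illustration of that family of methods (AdaBoost over decision stumps, with per-feature cumulative round weights); it is not the paper's specific algorithm, and the dataset, function names, and the top-1 selection rule are illustrative assumptions.

```python
import math

# Tiny synthetic dataset (assumed for illustration): 3 features,
# only feature 0 determines the label (sign flips around 0.5).
X = [
    [0.1, 0.9, 0.5], [0.2, 0.1, 0.4], [0.3, 0.8, 0.6], [0.4, 0.2, 0.5],
    [0.6, 0.7, 0.5], [0.7, 0.3, 0.4], [0.8, 0.9, 0.6], [0.9, 0.1, 0.5],
]
y = [-1, -1, -1, -1, 1, 1, 1, 1]

def best_stump(X, y, w):
    """Return (feature, threshold, polarity, weighted_error) of the
    decision stump minimizing the weighted 0-1 error."""
    best = None
    for f in range(len(X[0])):
        for t in sorted({row[f] for row in X}):
            for pol in (1, -1):  # pol: predicted sign when x[f] > t
                err = sum(wi for xi, yi, wi in zip(X, y, w)
                          if (pol if xi[f] > t else -pol) != yi)
                if best is None or err < best[3]:
                    best = (f, t, pol, err)
    return best

def boosted_importances(X, y, rounds=5):
    """AdaBoost over stumps; each round's weight alpha is credited
    to the feature the winning stump split on."""
    n = len(X)
    w = [1.0 / n] * n               # uniform sample weights
    imp = [0.0] * len(X[0])         # cumulative per-feature credit
    for _ in range(rounds):
        f, t, pol, err = best_stump(X, y, w)
        err = max(err, 1e-10)       # guard against log(0) on a perfect stump
        if err >= 0.5:              # weak learner no better than chance
            break
        alpha = 0.5 * math.log((1 - err) / err)
        imp[f] += alpha             # credit the split feature
        # Reweight: increase weight of misclassified samples.
        for i, (xi, yi) in enumerate(zip(X, y)):
            pred = pol if xi[f] > t else -pol
            w[i] *= math.exp(-alpha * yi * pred)
        s = sum(w)
        w = [wi / s for wi in w]
    return imp

imp = boosted_importances(X, y)
selected = max(range(len(imp)), key=imp.__getitem__)
print("importances:", imp, "selected feature:", selected)
```

On this toy data the cumulative credit concentrates on feature 0, the only informative one, so it is the feature a top-k selection would retain. Published variants differ mainly in the importance measure (e.g. gradient-boosting gain rather than AdaBoost round weights) and in the stopping or re-ranking strategy.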