Information-Theoretic Gene Selection in Expression Data
[1] A. S. Weigend, et al. Selecting Input Variables Using Mutual Information and Nonparametric Density Estimation, 1994.
[2] Huan Liu, et al. Discretization: An Enabling Technique, 2002, Data Mining and Knowledge Discovery.
[3] Michel Verleysen, et al. Resampling methods for parameter-free and robust feature selection with mutual information, 2007, Neurocomputing.
[4] Ron Kohavi, et al. Supervised and Unsupervised Discretization of Continuous Features, 1995, ICML.
[5] Gregory M. Provan, et al. Learning Bayesian Networks Using Feature Selection, 1995, AISTATS.
[6] Geoffrey I. Webb, et al. On Why Discretization Works for Naive-Bayes Classifiers, 2003, Australian Conference on Artificial Intelligence.
[7] Gianluca Bontempi, et al. On the Use of Variable Complementarity for Feature Selection in Cancer Classification, 2006, EvoWorkshops.
[8] Huan Liu, et al. Searching for Interacting Features, 2007, IJCAI.
[9] Geoffrey I. Webb, et al. Discretization for naive-Bayes learning: managing discretization bias and variance, 2008, Machine Learning.
[10] Pat Langley, et al. Selection of Relevant Features and Examples in Machine Learning, 1997, Artif. Intell.
[11] M. K. Markey, et al. Application of the mutual information criterion for feature selection in computer-aided diagnosis, 2001, Medical Physics.
[12] Ivan Bratko, et al. Testing the significance of attribute interactions, 2004, ICML.
[13] Bernhard Sendhoff, et al. How to Determine the Redundancy of Noisy Chaotic Time Series, 1996.
[14] Ron Kohavi, et al. Wrappers for Feature Subset Selection, 1997, Artif. Intell.
[15] Liam Paninski, et al. Estimation of Entropy and Mutual Information, 2003, Neural Computation.
[16] Huan Liu, et al. Efficient Feature Selection via Analysis of Relevance and Redundancy, 2004, J. Mach. Learn. Res.
[17] Yudong D. He, et al. Gene expression profiling predicts clinical outcome of breast cancer, 2002, Nature.
[18] M. Studený, et al. The Multiinformation Function as a Tool for Measuring Stochastic Dependence, 1998, Learning in Graphical Models.
[19] Gianluca Bontempi, et al. Causal filter selection in microarray data, 2010, ICML.
[20] Naftali Tishby, et al. The information bottleneck method, 2000, arXiv.
[21] Igor Kononenko, et al. Estimating Attributes: Analysis and Extensions of RELIEF, 1994, ECML.
[22] Ivan Kojadinovic, et al. Relevance measures for subset variable selection in regression problems based on k-additive mutual information, 2005, Comput. Stat. Data Anal.
[23] Chris Wiggins, et al. ARACNE: An Algorithm for the Reconstruction of Gene Regulatory Networks in a Mammalian Cellular Context, 2004, BMC Bioinformatics.
[24] Colas Schretter, et al. Information-Theoretic Feature Selection in Microarray Data Using Variable Complementarity, 2008, IEEE Journal of Selected Topics in Signal Processing.
[25] Rich Caruana, et al. Greedy Attribute Selection, 1994, ICML.
[26] Yudong D. He, et al. A Gene-Expression Signature as a Predictor of Survival in Breast Cancer, 2002.
[27] Daniel Marbach, et al. Information-Theoretic Inference of Gene Networks Using Backward Elimination, 2010, BIOCOMP.
[28] Igor Vajda, et al. Estimation of the Information by an Adaptive Partitioning of the Observation Space, 1999, IEEE Trans. Inf. Theory.
[29] Huan Liu, et al. Incremental Feature Selection, 1998, Applied Intelligence.
[30] W. J. McGill. Multivariate information transmission, 1954, Trans. IRE Prof. Group Inf. Theory.
[31] F. Fleuret. Fast Binary Feature Selection with Conditional Mutual Information, 2004, J. Mach. Learn. Res.
[32] Daphne Koller, et al. Toward Optimal Feature Selection, 1996, ICML.
[33] Kevin Kontos, et al. Information-Theoretic Inference of Large Transcriptional Regulatory Networks, 2007, EURASIP J. Bioinform. Syst. Biol.
[34] Fuhui Long, et al. Feature selection based on mutual information criteria of max-dependency, max-relevance, and min-redundancy, 2003, IEEE Transactions on Pattern Analysis and Machine Intelligence.
[35] Isabelle Guyon, et al. An Introduction to Variable and Feature Selection, 2003, J. Mach. Learn. Res.
[36] Jonathon Shlens, et al. Estimating Entropy Rates with Bayesian Confidence Intervals, 2005, Neural Computation.
[37] Claude E. Shannon, et al. Communication theory of secrecy systems, 1949, Bell Syst. Tech. J.
[38] Michel Verleysen, et al. Mutual information for the selection of relevant variables in spectrometric nonlinear modelling, 2006, arXiv.
[39] Roberto Battiti, et al. Using mutual information for selecting features in supervised neural net learning, 1994, IEEE Trans. Neural Networks.
[40] David A. Bell, et al. A Formalism for Relevance and Its Application in Feature Subset Selection, 2000, Machine Learning.
[41] Carsten O. Daub, et al. Estimating mutual information using B-spline functions – an improved similarity measure for analysing gene expression data, 2004, BMC Bioinformatics.