On the Feature Selection Criterion Based on an Approximation of Multidimensional Mutual Information
[1] Roberto Battiti, et al. Using mutual information for selecting features in supervised neural net learning, 1994, IEEE Trans. Neural Networks.
[2] J. Moody, et al. Feature Selection Based on Joint Mutual Information, 1999.
[3] Martin E. Hellman, et al. Probability of error, equivocation, and the Chernoff bound, 1970, IEEE Trans. Inf. Theory.
[4] Fuhui Long, et al. Feature selection based on mutual information criteria of max-dependency, max-relevance, and min-redundancy, 2003, IEEE Trans. Pattern Anal. Mach. Intell.
[5] Alfred O. Hero, et al. Applications of entropic spanning graphs, 2002, IEEE Signal Process. Mag.
[6] L. Györfi, et al. Nonparametric entropy estimation: An overview, 1997.
[7] Alfred O. Hero, et al. Image matching using alpha-entropy measures and entropic graphs, 2005, Signal Process.
[8] Miguel Cazorla, et al. Feature selection, mutual information, and the classification of high-dimensional patterns, 2008, Pattern Analysis and Applications.
[9] Thomas M. Cover, et al. Elements of Information Theory, 2005.
[10] Chong-Ho Choi, et al. Input Feature Selection by Mutual Information Based on Parzen Window, 2002, IEEE Trans. Pattern Anal. Mach. Intell.
[11] Chong-Ho Choi, et al. Input feature selection for classification problems, 2002, IEEE Trans. Neural Networks.
[12] Philip M. Lewis, et al. The characteristic selection problem in recognition systems, 1962, IRE Trans. Inf. Theory.