CancelOut: A Layer for Feature Selection in Deep Neural Networks
Vadim Borisov | Johannes Haug | Gjergji Kasneci