iDropout: Leveraging Deep Taylor Decomposition for the Robustness of Deep Neural Networks

In this work, we present iDropout, a new method that adjusts dropout from purely random dropping of inputs to dropping based on a mix of node relevance and randomness. We use Deep Taylor Decomposition to compute the relevance of the inputs and, based on this, give input nodes with higher relevance a higher probability of being kept than input nodes that appear to have less impact. The proposed method not only appears to increase the performance of a neural network, but also seems to make the network more robust to missing data. We evaluate the approach on artificial data with various settings, e.g. noise in the data and the number of informative features, as well as on real-world datasets from the UCI Machine Learning Repository.
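
Since the abstract only sketches the mechanism, the following is a minimal, hypothetical sketch (not the authors' implementation) of how a relevance-weighted input mask might be sampled. It assumes per-feature relevance scores are already available, e.g. from a Deep Taylor Decomposition backward pass; the function name, the mixing coefficient `alpha`, and the keep rate `p_keep` are illustrative assumptions.

```python
import numpy as np

def relevance_weighted_mask(relevances, p_keep=0.8, alpha=0.5, rng=None):
    """Sample a dropout mask over input features.

    Hypothetical sketch: the keep probability of each feature blends a
    uniform keep rate (standard random dropout) with a term proportional
    to its relevance score (e.g. from Deep Taylor Decomposition).
    `alpha` controls the mix: 0 -> purely random dropout,
    1 -> purely relevance-driven keeping. All names are illustrative.
    """
    rng = np.random.default_rng() if rng is None else rng
    r = np.clip(np.asarray(relevances, dtype=float), 0.0, None)
    if r.sum() > 0:
        # Scale relevances so the expected number of kept features stays
        # comparable to standard dropout with rate (1 - p_keep).
        r_keep = np.clip(r / r.sum() * p_keep * r.size, 0.0, 1.0)
    else:
        r_keep = np.full_like(r, p_keep)
    keep_prob = (1.0 - alpha) * p_keep + alpha * r_keep
    mask = rng.random(r.shape) < keep_prob
    return mask.astype(float)

# Usage: mask an input vector during training, rescaling as in inverted dropout.
x = np.random.randn(10)
relevance = np.abs(x)              # stand-in for DTD relevance scores
mask = relevance_weighted_mask(relevance, p_keep=0.8, alpha=0.5)
x_dropped = x * mask / 0.8         # preserve the expected input magnitude
```

The blending of a uniform keep rate with relevance-proportional probabilities is one plausible reading of "a mix of relevance and randomness"; the paper itself should be consulted for the exact weighting scheme.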
