Computation of two-layer perceptron networks’ sensitivity to input perturbation
[1] Daniel S. Yeung, et al. Hidden neuron pruning of multilayer perceptrons using a quantified sensitivity measure, 2006, Neurocomputing.
[2] Vincenzo Piuri, et al. Sensitivity to errors in artificial neural networks: a behavioral approach, 1995.
[3] Daniel S. Yeung, et al. A Quantified Sensitivity Measure for Multilayer Perceptron to Input Perturbation, 2003, Neural Computation.
[4] Steve W. Piche, et al. The selection of weight accuracies for Madalines, 1995, IEEE Transactions on Neural Networks.
[5] Kang Zhang, et al. Computation of Adalines' sensitivity to weight perturbation, 2006, IEEE Transactions on Neural Networks.
[6] Daniel S. Yeung, et al. Using function approximation to analyze the sensitivity of MLP with antisymmetric squashing activation function, 2002, IEEE Transactions on Neural Networks.
[7] Wing W. Y. Ng, et al. Statistical Sensitivity Measure of Single Layer Perceptron Neural Networks to Input Perturbation, 2006, International Conference on Machine Learning and Cybernetics.
[8] Daniel S. Yeung, et al. Computation of Madalines' Sensitivity to Input and Weight Perturbations, 2006, Neural Computation.
[9] Wing W. Y. Ng, et al. Selection of weight quantisation accuracy for radial basis function neural network using stochastic sensitivity measure, 2003.
[10] Bernard Widrow, et al. Sensitivity of feedforward neural networks to weight errors, 1990, IEEE Transactions on Neural Networks.
[11] Daniel S. Yeung, et al. Sensitivity analysis of neocognitron, 1999, IEEE Transactions on Systems, Man, and Cybernetics, Part C.
[12] W. Gander, et al. Adaptive Quadrature—Revisited, 2000.
[13] Daniel S. Yeung, et al. Sensitivity analysis of multilayer perceptron to input and weight perturbations, 2001, IEEE Transactions on Neural Networks.