Extracting Both MofN Rules and if-then Rules from Trained Neural Networks
Artificial neural network classifiers have many advantages, such as noise tolerance, the possibility of parallelization, and effective training from small quantities of data. Coupling neural networks with an explanation component increases their usefulness in such applications. The explanation problem is commonly addressed by extracting the knowledge embedded in the trained network [Andrews et al., 1995]. We consider a single neuron (or perceptron) with the Heaviside map as activation function (f(x) = 0 if x < 0 else 1). For a given perceptron with connection weight vector W and threshold θ, explanation amounts to finding the input states in which the neuron is active, a problem that reduces to the knapsack problem. Existing algorithms report two forms of rules in the literature: the 'If (condition) then conclusion' form, noted 'if-then' rules, and the 'If (m of a set of conditions) then conclusion' form, noted 'MofN' rules. The intermediate structures that we introduce are called the MaxSubset list and the generator list. A MaxSubset is a minimal structure used to represent the if-then rules, while the generator list is a selection of MaxSubsets from which all MaxSubsets and all MofN rules can be derived. We introduce heuristics to prune and reduce the candidate search space: the incoming links are sorted by weight in descending order, and the search space is then pruned by bounding the subset cardinality between determined values.
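To make the search problem concrete, the following minimal Python sketch (all names hypothetical, and not the paper's actual algorithm) assumes a perceptron with positive weights already sorted in descending order; negative weights would typically be handled through complemented literals. It shows the Heaviside unit, the cardinality bounds used as a pruning heuristic, and a brute-force enumeration of the minimal activating subsets, which play the role of if-then rule premises, a role the MaxSubset and generator lists are designed to play more compactly.

```python
from itertools import combinations

def heaviside(x):
    # Heaviside activation from the abstract: f(x) = 0 if x < 0 else 1.
    return 0 if x < 0 else 1

def fires(weights, subset, theta):
    # Neuron output when exactly the inputs in `subset` are on.
    return heaviside(sum(weights[i] for i in subset) - theta) == 1

def cardinality_bounds(weights, theta):
    # Pruning heuristic (weights assumed positive, sorted descending):
    # k_min is the smallest k whose k LARGEST weights reach theta
    # (no smaller subset can fire); k_max is the smallest k whose
    # k SMALLEST weights reach theta (every subset of that size
    # fires, so no minimal premise is larger).
    n = len(weights)
    k_min = next(k for k in range(1, n + 1) if sum(weights[:k]) >= theta)
    k_max = next(k for k in range(1, n + 1) if sum(weights[n - k:]) >= theta)
    return k_min, k_max

def minimal_firing_subsets(weights, theta):
    # Enumerate, within the cardinality bounds, the subsets that fire
    # the neuron but stop firing when any single input is dropped;
    # each such subset is the premise of one if-then rule.
    if sum(weights) < theta:
        return []  # the neuron can never fire
    k_min, k_max = cardinality_bounds(weights, theta)
    premises = []
    for k in range(k_min, k_max + 1):
        for s in combinations(range(len(weights)), k):
            if fires(weights, s, theta) and all(
                    not fires(weights, set(s) - {i}, theta) for i in s):
                premises.append(set(s))
    return premises

# Toy run: weights already sorted in descending order.
w, theta = [4.0, 3.0, 3.0, 1.0], 6.0
print(cardinality_bounds(w, theta))      # (2, 3)
print(minimal_firing_subsets(w, theta))  # [{0, 1}, {0, 2}, {1, 2}]
```

In the toy run, the three size-2 premises all draw on the pool {x0, x1, x2}, so they collapse into the single rule 'if 2 of {x0, x1, x2} then active', illustrating the compression that the MofN form offers over a list of plain if-then rules.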
[1] Robert Andrews, Joachim Diederich, Alan B. Tickle. Survey and critique of techniques for extracting rules from trained artificial neural networks. Knowledge-Based Systems, 1995.
[2] Engelbert Mephu Nguifo et al. Towards a generalization of decompositional approach of rule extraction from multilayer artificial neural network. In The 2011 International Joint Conference on Neural Networks (IJCNN), 2011.