Generating rules from a trained network using fast pruning

Before symbolic rules are extracted from a trained neural network, the network is usually pruned to obtain more concise rules. Typical pruning algorithms require retraining the network, which incurs additional computational cost. This paper presents FERNN, a fast method for extracting rules from trained neural networks without network retraining. Given a fully connected trained feedforward network, FERNN first identifies the relevant hidden units by computing their information gains. Next, it identifies the relevant connections from the input units to the relevant hidden units by checking the magnitudes of their weights. Finally, FERNN generates rules based on the relevant hidden units and weights. Our experimental results show that the size and accuracy of the trees generated by FERNN are comparable to those extracted by another method that prunes and retrains the network.
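The first two steps described above can be sketched in code. The following is a minimal illustration, not the paper's implementation: the function names, the split-at-zero threshold, and the cutoff values are all hypothetical choices made here for concreteness. It shows how a hidden unit's relevance can be scored by the information gain of splitting the training samples on its activation, and how input connections can then be filtered by weight magnitude.

```python
import numpy as np

def information_gain(activations, labels, threshold=0.0):
    """Information gain from splitting samples on a hidden unit's
    activation at `threshold` (hypothetical helper; FERNN's exact
    criterion may differ)."""
    def entropy(y):
        if len(y) == 0:
            return 0.0
        _, counts = np.unique(y, return_counts=True)
        p = counts / counts.sum()
        return -np.sum(p * np.log2(p))
    left = labels[activations <= threshold]
    right = labels[activations > threshold]
    n = len(labels)
    return (entropy(labels)
            - (len(left) / n) * entropy(left)
            - (len(right) / n) * entropy(right))

def relevant_hidden_units(H, y, gain_cutoff=0.1):
    """Indices of hidden units whose activation split gives a gain
    above `gain_cutoff`. H is (samples x hidden units); y are labels."""
    return [j for j in range(H.shape[1])
            if information_gain(H[:, j], y) > gain_cutoff]

def relevant_weights(W, unit_indices, mag_cutoff=0.5):
    """For each selected hidden unit, keep input connections whose
    weight magnitude exceeds `mag_cutoff`. W is (inputs x hidden units)."""
    return {j: np.flatnonzero(np.abs(W[:, j]) > mag_cutoff)
            for j in unit_indices}

# Toy example: hidden unit 0 separates the classes, unit 1 does not.
H = np.array([[-1.0, 1.0], [-1.0, -1.0], [1.0, 1.0], [1.0, -1.0]])
y = np.array([0, 0, 1, 1])
W = np.array([[0.9, 0.1], [0.01, 2.0]])
units = relevant_hidden_units(H, y)
conns = relevant_weights(W, units)
```

Rules would then be generated only over the surviving units and connections, which is what keeps the extracted tree small without retraining the network.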