GDS: Gradient Descent Generation of Symbolic Classification Rules

Imagine you have designed a neural network that successfully learns a complex classification task. Which input features does the classifier rely on, and how are these features combined to produce the classification decisions? In some applications, e.g. economics or medicine, a deeper insight into the structure of an adaptive system, and thus into the underlying classification problem, may well be as important as the system's performance characteristics. GDS is a backpropagation-based training scheme that produces networks transformable into an equivalent and concise set of IF-THEN rules. This is achieved by imposing penalty terms on the network parameters that adapt the network to the expressive power of this class of rules. During training we therefore simultaneously minimize classification error and transformation error. Several real-world tasks demonstrate the viability of our approach.
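The abstract describes training with an added penalty term so that the learned weights remain expressible as IF-THEN rules. The paper's exact penalty is not given here, so the sketch below uses an assumed, illustrative penalty that pulls each weight toward the discrete set {-1, 0, +1}; a thresholded readout of such weights maps naturally onto rule antecedents. A single sigmoid unit is trained on Boolean AND by gradient descent on a combined loss: cross-entropy classification error plus the weight-discretization penalty.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Boolean AND task: output 1 only when both inputs are 1.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 0.0, 0.0, 1.0])

rng = np.random.default_rng(0)
w = rng.normal(scale=0.1, size=2)
b = 0.0
lr, lam = 0.5, 0.01  # learning rate and penalty strength (illustrative values)

for _ in range(5000):
    p = sigmoid(X @ w + b)
    err = p - y                      # gradient of cross-entropy w.r.t. the logit
    grad_w = X.T @ err / len(X)
    grad_b = err.mean()
    # Assumed penalty P(w) = sum_i w_i^2 (1 - w_i^2)^2, minimized at
    # w_i in {-1, 0, +1}; its gradient is 2 w (1 - w^2)(1 - 3 w^2).
    pen_grad = 2 * w * (1 - w**2) * (1 - 3 * w**2)
    w -= lr * (grad_w + lam * pen_grad)
    b -= lr * grad_b

preds = (sigmoid(X @ w + b) > 0.5).astype(float)
print(preds)
```

With the penalty active, both weights settle near +1 while the bias stays free, so the trained unit can be read off as roughly "IF x1 AND x2 THEN class 1". This is only one plausible realization of "adapting the network to the expressive power" of rules; the actual GDS penalty terms may differ.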
