Knowledge-based neural networks are networks whose topology is determined by mapping the dependencies of a domain-specific rulebase into a neural network. However, existing network training methods lack the ability to add new rules to the (reformulated) rulebases. Thus, on domain theories that lack rules, generalization is poor, and training can corrupt the original rules, even those that were initially correct. We present TopGen, an extension to the KBANN algorithm, that heuristically searches for possible expansions of a knowledge-based neural network, guided by the domain theory, the network, and the training data. It does this by dynamically adding hidden nodes to the neural representation of the domain theory, in a manner analogous to adding rules and conjuncts to the symbolic rulebase. Experiments indicate that our method is able to heuristically find effective places to add nodes to the knowledge-based network, and they verify that new nodes must be added in an intelligent manner.
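The sketch below is a minimal, illustrative reading of the idea described above, not the authors' implementation: each conjunctive rule is mapped to a hidden unit that fires only when its antecedents are active (the KBANN-style translation), and a TopGen-style expansion splices a new, lightly weighted hidden unit under a chosen consequent so that backpropagation, rather than the original theory, determines what the new "rule" comes to represent. All names (Unit, rule_to_unit, add_rule_node, OMEGA) and the specific weight values are assumptions made for illustration.

```python
import math
import random

OMEGA = 4.0  # assumed link weight used when translating rule antecedents (illustrative)

class Unit:
    """A single sigmoid unit; 'incoming' maps antecedent units to link weights."""
    def __init__(self, name, bias=0.0):
        self.name = name
        self.bias = bias
        self.incoming = {}  # Unit -> weight

    def activation(self, values):
        # 'values' maps unit names to their activations (inputs or lower-layer units).
        net = self.bias + sum(w * values[u.name] for u, w in self.incoming.items())
        return 1.0 / (1.0 + math.exp(-net))

def rule_to_unit(consequent, antecedents):
    """Map a conjunctive rule 'consequent :- antecedents' to a hidden unit that
    activates only when all antecedents are active (KBANN-style AND unit)."""
    unit = Unit(consequent, bias=-(len(antecedents) - 0.5) * OMEGA)
    for a in antecedents:
        unit.incoming[a] = OMEGA
    return unit

def add_rule_node(network, consequent_unit, candidate_antecedents):
    """Sketch of a TopGen-style expansion: add a new hidden unit beneath an
    existing consequent with near-zero weights, analogous to adding a new
    rule or conjunct whose content is then learned from the training data."""
    new_unit = Unit(f"new_rule_for_{consequent_unit.name}", bias=0.0)
    for a in candidate_antecedents:
        new_unit.incoming[a] = random.uniform(-0.1, 0.1)
    consequent_unit.incoming[new_unit] = random.uniform(-0.1, 0.1)
    network.append(new_unit)
    return new_unit

# Example: translate one rule, then expand the network under its consequent.
inputs = [Unit("a"), Unit("b")]
concept = rule_to_unit("concept", inputs)
network = [*inputs, concept]
add_rule_node(network, concept, inputs)
```

In this reading, the key property is that the original rule units keep their large, theory-derived weights while new units start near zero, so the expansion can only help where the data demands it; the paper's contribution lies in heuristically choosing where such nodes are added.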