Control of Hypothesis Space Using Meta-knowledge in Inductive Learning
Inductive logic programming (ILP) is effective for classification learning because it constructs hypotheses that incorporate background knowledge. On the other hand, this makes the cost of searching the hypothesis space large. This paper proposes a method for pruning hypotheses using a kind of semantic knowledge. When an ILP system performs a top-down search, after visiting a clause (rule) it explores further clauses by adding conditions. An added condition may be redundant with conditions already in the clause, or it may make the body of the clause unsatisfiable. We study how to represent this redundancy and unsatisfiability of conditions as meta-knowledge about predicates, and how to use it during search. In this paper we give a formalism of such meta-knowledge and show how to use it with an ILP algorithm. We also study a method for generating meta-knowledge automatically; it produces meta-knowledge that controls redundancy and contradiction with respect to predicates by testing their properties extensionally.
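To make the pruning idea concrete, the following Python sketch (an illustration based on the abstract, not the authors' implementation) shows how a top-down refinement step might consult predicate-level meta-knowledge to reject candidate literals that are redundant with, or contradictory to, conditions already in the clause body. The literal representation and the REDUNDANT and CONTRADICTORY tables are assumptions made for this example.

```python
# Sketch of top-down clause refinement with meta-knowledge pruning.
# Literals are (predicate, args) tuples; the meta-knowledge tables below are
# hypothetical examples of the predicate-level properties the paper describes.

# Adding the second predicate to a body that already contains the first over the
# same arguments is useless (redundant) or makes the body unsatisfiable.
REDUNDANT = {
    ("parent", "ancestor"),      # parent(X,Y) already implies ancestor(X,Y)
}
CONTRADICTORY = {
    ("male", "female"),          # male(X) and female(X) cannot both hold
}


def same_args(lit_a, lit_b):
    """True when the two literals share exactly the same argument tuple."""
    return lit_a[1] == lit_b[1]


def prune_reason(body, candidate):
    """Return a reason string if the candidate literal should be pruned, else None."""
    for lit in body:
        pair = (lit[0], candidate[0])
        if pair in REDUNDANT and same_args(lit, candidate):
            return f"redundant with {lit}"
        if (pair in CONTRADICTORY or pair[::-1] in CONTRADICTORY) and same_args(lit, candidate):
            return f"contradicts {lit}"
    return None


def refine(body, candidate_literals):
    """Yield only the refinements that survive the meta-knowledge filter."""
    for cand in candidate_literals:
        reason = prune_reason(body, cand)
        if reason is None:
            yield body + [cand]
        else:
            print(f"pruned {cand}: {reason}")


if __name__ == "__main__":
    # Current clause body: ... :- parent(X,Y), male(X).
    body = [("parent", ("X", "Y")), ("male", ("X",))]
    candidates = [
        ("ancestor", ("X", "Y")),   # redundant: implied by parent(X,Y)
        ("female", ("X",)),         # contradictory: male(X) already in body
        ("parent", ("Y", "Z")),     # kept: a genuinely new condition
    ]
    for refined in refine(body, candidates):
        print("kept refinement:", refined)
```

In this sketch the meta-knowledge tables are written by hand; the automatic generation step described in the abstract would instead derive such entries by testing the predicates' extensions in the background knowledge.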