Are Substitutions the Better Examples? Learning Complete Sets of Clauses with Frog

The paper presents an approach to machine learning in a restricted first-order language with finite minimal Herbrand models, realized as a search through a propositional representation space. The learning target is a set of goal clauses that define a target predicate; that is, we deal with single-predicate learning. For the search we use the learning algorithm JoJo/Frog, which provides a flexible search strategy. The transition from the first-order representation to the propositional representation is achieved by ground substitutions that turn clauses into ground clauses. A closer look at this transition shows that the sufficiency condition used by algorithms such as FOIL and LINUS to judge the learning result does not correspond to the completeness condition of the propositional case. We therefore use an extended completeness condition that captures all information given by the example knowledge. As a consequence we obtain a new definition of positive and negative examples: instead of ground facts, we regard ground substitutions as examples.
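To make the central idea concrete, the following is a minimal sketch (not the paper's implementation) of how ground substitutions, rather than ground facts, can serve as examples: a candidate clause is instantiated with every ground substitution over a finite Herbrand universe, and each substitution is labelled positive or negative. The predicate names, the example clause, and the closed-world labelling rule are assumptions made for illustration only.

```python
# Sketch: ground substitutions of a candidate clause as learning examples.
# All predicates, facts, and the labelling rule below are illustrative
# assumptions, not the paper's actual data or algorithm.
from itertools import product

# Finite Herbrand universe (constants only, no function symbols).
UNIVERSE = ["ann", "bob", "cid"]

# Example knowledge as ground facts (minimal Herbrand model of the background).
FACTS = {
    ("parent", ("ann", "bob")),
    ("parent", ("bob", "cid")),
    ("grandparent", ("ann", "cid")),   # known positive fact for the target predicate
}

# Candidate clause: grandparent(X, Z) :- parent(X, Y), parent(Y, Z).
HEAD = ("grandparent", ("X", "Z"))
BODY = [("parent", ("X", "Y")), ("parent", ("Y", "Z"))]
VARS = ["X", "Y", "Z"]

def apply(literal, theta):
    """Instantiate a literal with a ground substitution theta."""
    pred, args = literal
    return (pred, tuple(theta.get(a, a) for a in args))

def classify_substitutions():
    """Yield (theta, label) for every ground substitution whose body is true.

    Labelling rule assumed here: a substitution is a positive example if the
    instantiated body holds in the facts and the instantiated head is a known
    fact; it is a negative example if the body holds but the head is not a
    fact (closed-world assumption)."""
    for values in product(UNIVERSE, repeat=len(VARS)):
        theta = dict(zip(VARS, values))
        if not all(apply(lit, theta) in FACTS for lit in BODY):
            continue
        label = "positive" if apply(HEAD, theta) in FACTS else "negative"
        yield theta, label

if __name__ == "__main__":
    for theta, label in classify_substitutions():
        print(label, theta)
```

Under these assumptions the only substitution with a true body is {X: ann, Y: bob, Z: cid}, and it is labelled positive because grandparent(ann, cid) is a known fact; any substitution with a true body but an unknown head would count as a negative example of the clause.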

[1] Nada Lavrac et al. The Multi-Purpose Incremental Learning System AQ15 and Its Testing Application to Three Medical Domains. AAAI, 1986.

[2] L. De Raedt. Interactive Theory Revision: An Inductive Logic Programming Approach. 1992.

[3] Tom M. Mitchell. Generalization as Search. 2002.

[4] Luc De Raedt et al. A Theory of Clausal Discovery. IJCAI, 1993.

[5] Nicolas Helft. Induction as Nonmonotonic Inference. KR, 1989.

[6] Saso Dzeroski et al. Learning Nonrecursive Definitions of Relations with LINUS. EWSL, 1991.

[7] Saso Dzeroski et al. Inductive Learning in Deductive Databases. IEEE Trans. Knowl. Data Eng., 1993.

[8] Dieter Fensel et al. Refinement of Rule Sets with JoJo. ECML, 1993.

[9] Dieter Fensel. JoJo: Integration of Generalization and Specialization. 1993.

[10] J. Ross Quinlan. Learning Efficient Classification Procedures and Their Application to Chess End Games. 1983.

[11] Rudolf Wille. Knowledge Acquisition by Methods of Formal Concept Analysis. 1989.

[12] R. Mike Cameron-Jones et al. FOIL: A Midterm Report. ECML, 1993.

[13] Stephen Muggleton et al. Efficient Induction of Logic Programs. ALT, 1990.

[14] Saso Dzeroski et al. Weakening the Language Bias in LINUS. J. Exp. Theor. Artif. Intell., 1994.

[15] J. Lloyd. Foundations of Logic Programming. Symbolic Computation, 1984.

[16] Stephen Muggleton et al. Machine Invention of First Order Predicates by Inverting Resolution. ML, 1988.

[17] Luc De Raedt et al. Inductive Logic Programming: Theory and Methods. J. Log. Program., 1994.

[18] Keith L. Clark. Negation as Failure. Logic and Data Bases, 1987.

[19] Francesco Bergadano et al. Inductive Database Relations. IEEE Trans. Knowl. Data Eng., 1993.

[20] Luc De Raedt et al. Multiple Predicate Learning. IJCAI, 1993.