A new algorithm to automate inductive learning of default theories

In inductive learning of a broad concept, an algorithm must distinguish genuine examples of the concept from exceptions and from noisy data. Recursively finding patterns in the exceptions turns out to correspond to the problem of learning default theories. Default logic models the way humans perform common-sense reasoning, so learned default theories are more readily understood by humans. In this paper, we present new algorithms that learn default theories in the form of non-monotonic logic programs. Experiments reported in this paper show that our algorithms significantly outperform traditional approaches based on inductive logic programming.
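To make the idea concrete, the following is a minimal sketch (not the paper's actual algorithm) of the kind of default theory such a learner produces: a default rule covering most examples, with exceptions factored out via negation as failure. The logic-program form would be `flies(X) :- bird(X), not ab(X).` with `ab(X) :- penguin(X).`; here it is simulated in Python, and the bird/penguin facts are the classic illustrative example, not data from the paper.

```python
# Illustrative facts (the classic bird/penguin example, not data from the paper).
birds = {"tweety", "sam", "opus"}
penguins = {"opus"}  # exceptions to the default "birds fly"

def abnormal(x):
    # ab(X) :- penguin(X).  -- the exception pattern found in the negative examples
    return x in penguins

def flies(x):
    # flies(X) :- bird(X), not ab(X).  -- default rule with negation as failure
    return x in birds and not abnormal(x)

print(sorted(b for b in birds if flies(b)))
```

A learner that could not represent the `not ab(X)` literal would be forced either to treat `opus` as noise or to fragment the default into several narrower rules; the non-monotonic form keeps the broad concept intact while isolating its exceptions.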
