Simultaneous Learning and Prediction

Agents in real-world environments may have only partial access to available information, often in an arbitrary or hard-to-model manner. By reasoning with the knowledge at their disposal, agents may hope to recover some of the missing information. By acquiring that knowledge through a process of learning, agents may further hope to guarantee that the recovered information is indeed correct. Assuming only black-box access to a learning process and a prediction process that can cope with missing information in some principled manner, we examine how the two processes should interact so as to improve their overall joint performance. We identify natural scenarios under which interleaving the processes is provably beneficial over their independent use.
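The interleaving idea can be illustrated with a minimal sketch: a toy "black-box" learner that tracks, per attribute, the majority value it has observed, and a predictor that fills in missing attributes from that same statistic. Each partially observed example is first completed by the predictor, and the completed example is then fed back to the learner. All names here (`MajorityLearner`, `mask`, the per-attribute majority rule) are illustrative assumptions, not constructs from the paper.

```python
def mask(example, i):
    """Hide attribute i of an example (None marks a missing value)."""
    return [v if j != i else None for j, v in enumerate(example)]

class MajorityLearner:
    """Toy black-box learner/predictor: per attribute, track the majority observed value."""
    def __init__(self, n):
        self.counts = [[0, 0] for _ in range(n)]  # counts[i][v] for v in {0, 1}

    def learn(self, example):
        # Update statistics only from attributes that are present.
        for i, v in enumerate(example):
            if v is not None:
                self.counts[i][v] += 1

    def predict(self, example):
        # Fill each missing attribute with its majority value so far (ties -> 1).
        return [v if v is not None else int(self.counts[i][1] >= self.counts[i][0])
                for i, v in enumerate(example)]

truth = [1, 0, 1, 1, 0]
learner = MajorityLearner(len(truth))

# Interleaved loop: each partial observation is first completed by the
# predictor, and the completed example is fed back into the learner.
for r in range(25):
    partial = mask(truth, r % len(truth))   # one attribute missing per round
    learner.learn(learner.predict(partial))

print(learner.predict([None] * len(truth)))  # -> [1, 0, 1, 1, 0]
```

The point of the sketch is the feedback loop, not the (deliberately trivial) hypothesis class: because predicted values are written back into the training stream, the learner sees completed examples rather than partial ones, which is the kind of interaction whose benefits the paper analyzes.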
