An experiment in the application of similarity‐based learning to programming by example

Programming by example is a powerful way of bestowing on nonprogrammers the ability to communicate tasks to a computer. When creating procedures from examples, it is necessary to infer the existence of variables, conditional branches, and loops. This article explores the role of empirical, or "similarity‐based," learning in this process. As a concrete example of a procedure induction system, we use an existing scheme called METAMOUSE, which allows graphical procedures to be specified from examples of their execution. A procedure is induced from the first example and generalized in accordance with examples encountered later. We describe how the system can be enhanced with Mitchell's candidate elimination algorithm, one of the simplest empirical learning techniques, to improve its ability to recognize constraints in a comprehensive and flexible manner. Procedure induction is undoubtedly a complex task. This work demonstrates the usefulness and effectiveness of empirical learning in procedure induction, although such learning cannot completely substitute for specific, preprogrammed domain knowledge where that knowledge is readily available. In domains such as graphical editing, however, where knowledge is incomplete and/or incorrect, the most promising approach may prove to be a combination of similarity‐ and explanation‐based learning. © 1994 John Wiley & Sons, Inc.
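The candidate elimination algorithm mentioned above maintains a version space bounded by a most specific hypothesis S and a set G of most general hypotheses, refining both as positive and negative examples arrive. The following is a minimal sketch for conjunctive attribute-value concepts only; the attribute domains and toy data are illustrative and are not drawn from the article.

```python
# Sketch of Mitchell's candidate elimination algorithm for conjunctive
# attribute-value concepts. A hypothesis is a tuple whose entries are either
# a concrete value or the wildcard "?"; None stands for the most specific
# hypothesis, which matches nothing. Names and data are illustrative.

WILDCARD = "?"

def matches(h, x):
    """True if hypothesis h covers instance x."""
    return all(a == WILDCARD or a == v for a, v in zip(h, x))

def more_general(h1, h2):
    """True if h1 covers at least everything h2 covers."""
    return all(a == WILDCARD or a == b for a, b in zip(h1, h2))

def generalize(s, x):
    """Minimal generalization of the specific boundary s to cover positive x."""
    if s is None:
        return tuple(x)  # first positive example becomes S
    return tuple(a if a == v else WILDCARD for a, v in zip(s, x))

def specializations(h, s, x, domains):
    """Minimal specializations of h that exclude negative x while remaining
    consistent with (more general than) the specific boundary s."""
    out = []
    for i, a in enumerate(h):
        if a == WILDCARD:
            for v in domains[i]:
                if v != x[i] and (s is None or s[i] in (WILDCARD, v)):
                    out.append(h[:i] + (v,) + h[i + 1:])
    return out

def candidate_elimination(examples, domains):
    s = None                                  # most specific boundary
    g = [tuple(WILDCARD for _ in domains)]    # most general boundary
    for x, positive in examples:
        if positive:
            g = [h for h in g if matches(h, x)]   # drop inconsistent G members
            s = generalize(s, x)
        else:
            if s is not None and matches(s, x):
                raise ValueError("version space collapsed")
            new_g = []
            for h in g:
                if matches(h, x):
                    new_g.extend(specializations(h, s, x, domains))
                else:
                    new_g.append(h)
            # keep only maximally general, non-duplicate members
            candidates = set(new_g)
            g = [h for h in candidates
                 if not any(h2 != h and more_general(h2, h) for h2 in candidates)]
    return s, g

# Toy run over three two-valued attributes (hypothetical data):
domains = [("sunny", "rainy"), ("warm", "cold"), ("normal", "high")]
examples = [
    (("sunny", "warm", "normal"), True),
    (("rainy", "cold", "high"), False),
    (("sunny", "warm", "high"), True),
]
s, g = candidate_elimination(examples, domains)
# s converges toward ('sunny', 'warm', '?'); g retains the maximally
# general hypotheses still consistent with all three examples.
```

In the METAMOUSE setting described above, each graphical constraint candidate would play the role of an attribute, and the S and G boundaries delimit which constraints remain plausible after each demonstrated example.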

[1] Jadzia Cendrowska, et al. PRISM: An Algorithm for Inducing Modular Rules, 1987, Int. J. Man Mach. Stud.

[2] Ian H. Witten, et al. A framework for knowledge acquisition through techniques of concept learning, 1989, IEEE Trans. Syst. Man Cybern.

[3] Peter M. Andreae, et al. Constraint Limited Generalization: Acquiring Procedures From Examples, 1984, AAAI.

[4] Ian H. Witten, et al. Metamouse: specifying graphical procedures by example, 1989, SIGGRAPH.

[5] Ryszard S. Michalski, et al. A Theory and Methodology of Inductive Learning, 1983, Artificial Intelligence.

[6] Carl H. Smith, et al. Inductive Inference: Theory and Methods, 1983, CSUR.

[7] Jaime G. Carbonell, et al. Introduction: Paradigms for Machine Learning, 1989, Artif. Intell.

[8] Jaime G. Carbonell, et al. An Overview of Machine Learning, 1983.

[9] Peter M. Andreae. Justified generalization: acquiring procedures from examples, 1984.

[10] Tom M. Mitchell, et al. Generalization as Search, 1982, Artificial Intelligence.

[11] Michael R. Genesereth, et al. Logical foundations of artificial intelligence, 1987.

[12] Ian H. Witten, et al. Learning text editing tasks from examples: a procedural approach, 1992.

[13] J. Ross Quinlan, et al. Learning Efficient Classification Procedures and Their Application to Chess End Games, 1983.

[14] Thomas G. Dietterich, et al. A Comparative Review of Selected Methods for Learning from Examples, 1983.

[15] Robert Nix, et al. Editing by example, 1984, POPL '84.

[16] E. Mark Gold, et al. Language Identification in the Limit, 1967, Inf. Control.

[17] Ian H. Witten, et al. Using Concept Learning for Knowledge Acquisition, 1988, Int. J. Man Mach. Stud.