Advanced explanation capabilities for intelligent tutoring systems: The explanation structure model (EXSEL)

This paper introduces an advanced explanation capability for intelligent tutoring systems (ITSs) that assists students in understanding human-designed objects such as electric circuits. In general, an object can be explained in various ways. To advance the explanation capability, it is necessary to clarify how these explanations should be used according to the tutoring goals in the domain. Designing an ITS with such a capability therefore requires: (1) a classification of explanations based on the tutoring goals, and (2) a facility that generates the classified explanations. First, this paper defines object understanding using two concepts, viewpoint and object model, and classifies explanations by treating the defined object understanding as a tutoring goal. Next, the explanation facility, the Explanation Structure model (EXSEL), is proposed. EXSEL provides a mechanism for generating an explanation structure, which serves as a resource for explanation, and is designed with its application to ITSs in mind. The definition of object understanding, the classification of explanations, and EXSEL are applicable to objects that can be explained at three abstraction levels: structure, behavior, and function. By providing mechanisms that operate EXSEL according to tutoring requisites such as the student's current understanding, a variety of ITSs that strongly support object understanding can be realized.
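
To make these ideas concrete, the following Python sketch illustrates one possible reading of the abstract: an object model whose facts are indexed by the three abstraction levels, a viewpoint that encodes a tutoring goal, and a function that selects matching facts as a simple explanation structure. All class names, the circuit facts, and the selection logic are hypothetical assumptions introduced here for illustration, not the paper's actual formalism.

```python
from dataclasses import dataclass, field
from enum import Enum, auto


class AbstractionLevel(Enum):
    """The three abstraction levels at which an object can be explained."""
    STRUCTURE = auto()   # how the object's components are connected
    BEHAVIOR = auto()    # how the components change state over time
    FUNCTION = auto()    # what purpose that behavior serves


@dataclass
class Viewpoint:
    """Hypothetical viewpoint: the abstraction level and aspect of the object
    that the tutoring goal asks the explanation to focus on."""
    level: AbstractionLevel
    focus: str  # e.g. "switch"


@dataclass
class ObjectModel:
    """Hypothetical object model: facts about the object, indexed by level."""
    name: str
    facts: dict = field(default_factory=dict)  # AbstractionLevel -> list[str]


def generate_explanation_structure(model: ObjectModel, view: Viewpoint) -> list:
    """Select the facts that match the viewpoint, yielding an ordered list
    that serves as a resource for generating an explanation."""
    candidates = model.facts.get(view.level, [])
    matching = [f for f in candidates if view.focus.lower() in f.lower()]
    return matching or candidates


# Usage: explain a simple series circuit at the behavior level.
circuit = ObjectModel(
    name="series circuit",
    facts={
        AbstractionLevel.STRUCTURE: ["Battery, switch, and lamp are connected in series."],
        AbstractionLevel.BEHAVIOR: ["Closing the switch lets current flow through the lamp."],
        AbstractionLevel.FUNCTION: ["The circuit exists to light the lamp on demand."],
    },
)
print(generate_explanation_structure(circuit, Viewpoint(AbstractionLevel.BEHAVIOR, "switch")))
```

In an ITS built around such a structure, a tutoring component could vary the viewpoint according to the student's current understanding, yielding different explanations of the same object model.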