Aggregation of Qualitative Simulations for Explanation

Qualitative simulations can be seen as knowledge models that capture insights about system behaviour that learners are expected to acquire. A problem learners encounter when interacting with qualitative simulations is the overwhelming amount of knowledge detail represented in such models. As a result, the discovery space grows too large, which hampers the learner's knowledge construction process. In this paper we present an approach to restructuring the output of a qualitative reasoning engine so that it is better suited for use in interactive learning environments. The approach combines techniques for simplifying state-graphs with techniques for aggregating the causal models within states. The result automatically highlights the main behavioural facts, in terms of simulation events and simplified causal accounts, while still allowing the learner to explore the aggregated constructs in more detail.
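As a rough illustration of the state-graph simplification side of such an approach (not the paper's actual algorithm), the sketch below collapses linear chains of states, so that only branch points and end states, the candidates for "events", remain visible. The edge-list encoding, the collapse_linear_chains helper, and the example graph are all hypothetical.

```python
from collections import defaultdict

def collapse_linear_chains(edges):
    """Merge runs of states that have exactly one predecessor and one
    successor into a single aggregated segment, leaving only branch
    points and terminal states as separate nodes."""
    succ, pred, nodes = defaultdict(list), defaultdict(list), set()
    for a, b in edges:
        succ[a].append(b)
        pred[b].append(a)
        nodes.update((a, b))

    def is_chain_node(n):
        # A state that merely passes behaviour along: one way in, one way out.
        return len(succ[n]) == 1 and len(pred[n]) == 1

    aggregated = []  # list of (segment_of_states, successor_state) edges
    for n in nodes:
        if is_chain_node(n):
            continue  # chain nodes are absorbed into a segment, never start one
        for nxt in succ[n]:
            segment = [n]
            while is_chain_node(nxt):
                segment.append(nxt)
                nxt = succ[nxt][0]
            aggregated.append((tuple(segment), nxt))
    return aggregated

if __name__ == "__main__":
    # Hypothetical state graph: 1 -> 2 -> 3, then 3 branches to 4 and 5.
    graph = [(1, 2), (2, 3), (3, 4), (3, 5)]
    for segment, target in collapse_linear_chains(graph):
        print(segment, "->", target)
    # Prints (1, 2) -> 3, (3,) -> 4, (3,) -> 5: the intermediate state 2
    # is folded into a segment, while the branch at state 3 stays explicit.
```

A comparable aggregation step could be applied within each state, for instance by chaining individual causal dependencies into summarized end-to-end influences, but how the causal models are aggregated is specific to the approach described in the paper.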