Combining Cognitive and Affective Measures with Epistemic Planning for Explanation Generation

This paper presents an overview of the EPSRC-funded project Start Making Sense, which investigates explainability and trust maintenance in interactive and autonomous systems. The project brings together experimental research in cognitive science on cooperative joint action with the practical construction of automated planning tools for the task of explanation generation. Its challenges are addressed through three concrete objectives: (i) to study cooperative joint action in humans to identify the affective and cognitive factors that are essential for successful human communication, (ii) to enhance epistemic planning techniques with measures derived from these studies for improved human-like explanation generation, and (iii) to deploy and evaluate the resulting system with human participants. We also describe initial work from the cognitive side of the project aimed at exploring how ambiguity, uncertainty, and certain types of biometric measurements affect instruction-giving and explanation actions in scenarios with humans. The insights from this work will be combined with epistemic planning techniques to generate appropriate explanatory actions in similar instruction-giving scenarios.
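To illustrate the planning side of the project, the sketch below shows one simple way epistemic planning can drive explanation generation: actions carry knowledge preconditions and knowledge effects on the hearer's state, and the planner searches for a sequence of explanatory actions that brings the hearer to a goal knowledge state. This is a minimal, hypothetical example in the spirit of knowledge-level planning, not the project's actual system; all domain names (`describe_object`, `give_route`, the `knows_*` facts) are invented for illustration.

```python
from collections import deque

def plan_explanations(hearer_knows, goal, actions):
    """Breadth-first search over the hearer's knowledge states for a
    sequence of explanation actions whose effects cover the goal facts.
    Each action maps a name to (knowledge_precondition, knowledge_effects)."""
    start = frozenset(hearer_knows)
    frontier = deque([(start, [])])
    seen = {start}
    while frontier:
        state, plan = frontier.popleft()
        if goal <= state:          # hearer now knows everything in the goal
            return plan
        for name, (precond, effects) in actions.items():
            if precond <= state:   # knowledge precondition holds
                nxt = frozenset(state | effects)  # hearer learns the effects
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append((nxt, plan + [name]))
    return None

# Hypothetical instruction-giving domain: the hearer must know the target
# object before a route to it can be meaningfully explained.
actions = {
    "describe_object": (frozenset(), frozenset({"knows_target"})),
    "give_route":      (frozenset({"knows_target"}), frozenset({"knows_route"})),
}

plan = plan_explanations(set(), frozenset({"knows_target", "knows_route"}), actions)
# → ["describe_object", "give_route"]
```

In a fuller treatment, the affective and biometric measures studied in the project could be folded in as costs or preferences over such explanatory actions, biasing the planner toward explanations that suit the hearer's current state.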
