Exploring User Satisfaction in a Tutorial Dialogue System

User satisfaction is a common evaluation metric for task-oriented dialogue systems, whereas tutorial dialogue systems are usually evaluated in terms of student learning gain. However, user satisfaction matters for such systems as well, since it may predict technology acceptance. We present REVU-NL, a detailed satisfaction questionnaire used in evaluating the Beetle II system, and explore the underlying components of user satisfaction using factor analysis. We demonstrate interesting patterns of interaction between interpretation quality, satisfaction, and the dialogue policy, highlighting the importance of more fine-grained evaluation of user satisfaction.
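
As a rough illustration of the exploratory step described in the abstract, the sketch below fits a factor analysis to Likert-scale questionnaire responses and inspects the rotated loadings to group items into underlying satisfaction components. The file name, the number of factors, and the use of scikit-learn's FactorAnalysis with varimax rotation are assumptions made for illustration; they are not taken from the paper, and the items shown are not the actual REVU-NL items.

```python
# Minimal sketch of an exploratory factor analysis over questionnaire data.
# Assumes a CSV with one row per participant and one column per Likert item;
# path, item names, and factor count are hypothetical.
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import FactorAnalysis

# Hypothetical responses: rows = participants, columns = Likert items (1-5).
responses = pd.read_csv("questionnaire_responses.csv")

# Standardise items so loadings are comparable across questions.
scaled = StandardScaler().fit_transform(responses)

# Extract a small number of latent factors; varimax rotation makes the
# loadings easier to interpret (requires scikit-learn >= 0.24).
fa = FactorAnalysis(n_components=3, rotation="varimax", random_state=0)
fa.fit(scaled)

# Loadings: how strongly each questionnaire item relates to each factor.
loadings = pd.DataFrame(
    fa.components_.T,
    index=responses.columns,
    columns=[f"factor_{i + 1}" for i in range(3)],
)
print(loadings.round(2))
```

Items that load highly on the same factor would then be read together as one candidate satisfaction component, which is the kind of grouping the factor analysis in the paper is used to uncover.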
