Multimodal Speech-based Dialogue for the Mini-Mental State Examination

We present a system-initiative multimodal speech-based dialogue system for the Mini-Mental State Examination (MMSE). The MMSE is a questionnaire-based cognitive test that is traditionally administered by a trained expert with pen and paper and scored manually afterwards to measure cognitive impairment. Using a digital pen and speech dialogue, we implement a multimodal system that administers and evaluates the MMSE automatically; user input is evaluated and scored in real time. We present a user experience study with 15 participants and compare the usability of the proposed system with the traditional approach. The experiment suggests that both modes perform equally well in terms of usability, while the proposed system receives higher novelty ratings. We also compare the assessment scores produced by our system with manual scores assigned by domain experts.
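To give a sense of how such a system-initiative assessment loop can be structured, the following Python sketch administers items one at a time and scores each response as it arrives. It is a minimal illustration under assumed interfaces: the names MMSEItem, run_assessment, ask, listen, and read_pen, as well as the example scoring rule, are hypothetical and do not reflect the authors' implementation.

```python
# Minimal sketch of a system-initiative MMSE dialogue loop.
# Hypothetical interfaces: item set, recognizer callbacks, and scoring
# rules are illustrative only, not the paper's actual implementation.
import datetime
from dataclasses import dataclass
from typing import Callable, Iterable

@dataclass
class MMSEItem:
    prompt: str                      # question spoken by the system
    modality: str                    # "speech" or "pen"
    score: Callable[[str], int]      # maps the recognized answer to points
    max_points: int

def run_assessment(items: Iterable[MMSEItem],
                   ask: Callable[[str], None],
                   listen: Callable[[], str],
                   read_pen: Callable[[], str]) -> int:
    """Administer each item, score the response in real time, return the total."""
    total = 0
    for item in items:
        ask(item.prompt)                                   # system-initiative prompt
        answer = listen() if item.modality == "speech" else read_pen()
        total += min(item.score(answer), item.max_points)  # immediate scoring
    return total

# Example item: temporal orientation ("What year is it?"), worth 1 point.
year_item = MMSEItem(
    prompt="What year is it?",
    modality="speech",
    score=lambda answer: int(str(datetime.date.today().year) in answer),
    max_points=1,
)
```

In such a design, the per-item callbacks can be backed by a speech recognizer and a digital-pen stroke classifier, while the running total makes the real-time score available throughout the session.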
