VibRein: An Engaging and Assistive Mobile Learning Companion for Students with Intellectual Disabilities

Massive Open Online Courses (MOOCs) have ushered in a new wave in education. Rich multimedia content, coupled with mobile delivery mechanisms, makes learning material always available and engaging. This paper proposes VibRein, which enriches student interaction with multimedia learning content by using the sensors available on a mobile device to create an intelligent video consumption experience. As a companion, VibRein is especially effective for students with intellectual disabilities who require some form of continuous supervision. It provides an assistive mechanism that tracks user attention through the device camera (particularly useful for students with attention disorders) and uses haptic feedback to recapture attention. During video consumption, VibRein evaluates learning by asking questions about the video content and automatically force-rewinds to the location where a concept was explained if the user answers incorrectly. Responses to questions are given by tilting the device in one of four directions, since touch as a modality on mobile devices requires fine motor skills. An evaluation with 18 users with various intellectual disabilities (autism, intellectual disability, and attention deficit hyperactivity disorder) suggests that VibRein can provide better learning with less intervention.
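The tilt-based answering and force-rewind behavior described above can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation; all names (`classify_tilt`, `check_answer`, the question dictionary fields) and the 20-degree tilt threshold are assumptions introduced here for illustration.

```python
def classify_tilt(pitch, roll, threshold=20.0):
    """Map device tilt (in degrees) to one of four answer directions.

    The tilt on the dominant axis must exceed `threshold`; otherwise the
    device is treated as held flat and no answer is registered.
    """
    if abs(pitch) < threshold and abs(roll) < threshold:
        return None  # device held roughly flat: no answer yet
    if abs(pitch) >= abs(roll):
        return "forward" if pitch > 0 else "backward"
    return "right" if roll > 0 else "left"


def check_answer(question, direction, playback_position):
    """Return the playback position to resume from after a question.

    On a wrong answer, force-rewind to the timestamp where the concept
    was explained; on a correct answer, continue from the current point.
    """
    if direction == question["correct_direction"]:
        return playback_position          # correct: keep playing
    return question["concept_timestamp"]  # wrong: rewind to explanation


# Example: a question whose correct answer is a leftward tilt, tied to
# an explanation that starts at 42 seconds into the video.
question = {"correct_direction": "left", "concept_timestamp": 42.0}
print(classify_tilt(5.0, -30.0))              # "left"
print(check_answer(question, "left", 120.0))  # 120.0 (continue)
print(check_answer(question, "right", 120.0)) # 42.0 (force-rewind)
```

A threshold on the dominant axis keeps small, unintentional hand movements from being misread as answers, which matters for users with limited fine motor control.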
