A quality framework for multimodal interaction in educational environments

Designing gesture-based, collaborative interactive systems with emerging technologies has shown notable potential for education and can enhance the memorability of digital content. In this paper, we present a quality framework and a novel combination of modalities for recognising different types of human gestures. The outcome can inform future classroom designs by replacing expensive smartboards, which are physically confined to a fixed location. We also propose a framework for interactive collaborative systems based on Norman's theory of action. The framework was tested in an educational environment using a combination of a Microsoft Kinect and a smartphone's built-in gyroscope, together with two types of interfaces. The results showed improved interactivity and participation, reflected in changes across five pedagogical factors in the different study groups, and particularly among older participants.
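To make the modality combination concrete, the sketch below shows one simple way such fusion could work: Kinect skeleton tracking supplies hand positions while the smartphone gyroscope supplies wrist rotation rate, and a gesture window is labelled from both signals. All names, data structures, and thresholds here are hypothetical illustrations, not the paper's actual recognition pipeline.

```python
from dataclasses import dataclass

# Hypothetical fused sample: the Kinect contributes a 2-D hand position
# (skeleton frame, metres) and the phone's gyroscope contributes an
# angular-velocity magnitude (rad/s).
@dataclass
class Sample:
    hand_x: float     # lateral hand position from Kinect skeleton
    hand_y: float     # vertical hand position from Kinect skeleton
    gyro_rate: float  # rotation rate from the smartphone gyroscope

def classify_gesture(samples: list[Sample]) -> str:
    """Label a window of fused samples as 'rotate', 'swipe', or 'idle'.
    Thresholds are illustrative, not tuned values from the study."""
    dx = samples[-1].hand_x - samples[0].hand_x          # net lateral travel
    mean_rate = sum(s.gyro_rate for s in samples) / len(samples)
    if mean_rate > 2.0:   # sustained fast wrist rotation dominates
        return "rotate"
    if abs(dx) > 0.30:    # large lateral hand movement across the window
        return "swipe"
    return "idle"

# Example: the hand travels 0.4 m to the right with negligible rotation.
window = [Sample(0.1 + 0.04 * i, 0.0, 0.1) for i in range(11)]
print(classify_gesture(window))  # → swipe
```

The point of the sketch is the complementarity the abstract alludes to: coarse body-scale motion comes from the depth camera, while fine rotational motion comes from the handheld sensor, and neither modality alone distinguishes all three classes.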
