AmbiLearn: Ambient Intelligent Multimodal Learning Environment for Children

Multimodality has real potential to improve interaction between computers and people. Research in the fields of pervasive computing and ambient intelligence has resulted in multimodal interfaces that enable communication between technology and people to be more natural and efficient. With many organisations 'harnessing technology' for educational and entertainment purposes, there is a growing interest in the field of edutainment software for children. Interactive games provide a suitable means for producing a motivational and enjoyable learning environment. This paper presents preliminary work on the development of an ambient intelligent multimodal learning environment for children (AmbiLearn) and an edutainment game (TreasureLearn).
