Hidden Markov Model based gesture recognition on low-cost, low-power Tangible User Interfaces

The development of new human–computer interaction technologies that go beyond the traditional mouse and keyboard is gaining momentum as smart interactive spaces and virtual reality become part of our everyday life. Tangible User Interfaces (TUIs) introduce physical objects that people can manipulate to interact with smart spaces. Smart objects used as TUIs can further improve the user experience by recognizing natural gestures and coupling them to commands issued to the computing system. Hidden Markov Models (HMMs) are a typical approach to gesture recognition. In this paper, we show how the HMM forward algorithm can be adapted to run on low-power, low-cost microcontrollers without a floating-point unit, so that it can be embedded into a variety of TUIs. The proposed solution is validated on a set of gestures performed with the Smart Micrel Cube (SMCube), a TUI developed within the TANGerINE framework. Throughout the paper we evaluate the complexity of the algorithm and the performance of the recognition algorithm as a function of the number of bits used to represent data. Furthermore, we explore a multiuser scenario where up to four people share the same cube. Results show that the proposed solution performs comparably to the standard forward algorithm run on a PC with double-precision floating-point calculations.
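The abstract does not disclose the SMCube model parameters, the number of hidden states, or the exact scaling scheme used in the paper, so the following is only an illustrative sketch of the general idea: an integer-only forward pass in Q15 fixed point, where the usual floating-point normalization is replaced by power-of-two rescaling whose shift count serves as a coarse log-likelihood for comparing gesture models. The 2-state model, the `forward_score` name, and the example sequences are assumptions, not the paper's actual implementation.

```c
#include <stdint.h>

#define N 2          /* hidden states (illustrative, not the paper's value) */
#define M 2          /* observation symbols                                 */
#define Q 15         /* Q15 fixed point: 1.0 is represented as 1 << 15      */
#define ONE (1u << Q)

/* Hypothetical 2-state model; each row is quantized so it sums to ONE. */
static const uint16_t A[N][N] = { {22938,  9830},    /* 0.70 0.30 */
                                  {13107, 19661} };  /* 0.40 0.60 */
static const uint16_t B[N][M] = { {29491,  3277},    /* 0.90 0.10 */
                                  { 6554, 26214} };  /* 0.20 0.80 */
static const uint16_t PI[N]   =   {19661, 13107};    /* 0.60 0.40 */

/* Q15 multiply: widen, multiply, shift back (truncates). */
static uint32_t qmul(uint32_t a, uint32_t b)
{
    return (uint32_t)(((uint64_t)a * b) >> Q);
}

/* Integer-only forward algorithm. Instead of dividing alpha by its sum
 * (which needs floating point), the vector is rescaled by powers of two
 * and the shifts are counted; -shifts is a coarse log2-likelihood that
 * is sufficient to rank competing gesture models. */
int forward_score(const uint8_t *obs, int T)
{
    uint32_t alpha[N], next[N];
    int shifts = 0;

    for (int i = 0; i < N; i++)
        alpha[i] = qmul(PI[i], B[i][obs[0]]);

    for (int t = 1; t < T; t++) {
        for (int j = 0; j < N; j++) {
            uint32_t s = 0;
            for (int i = 0; i < N; i++)
                s += qmul(alpha[i], A[i][j]);
            next[j] = qmul(s, B[j][obs[t]]);
        }
        /* Block-floating-point rescale: keep max(alpha) in [ONE/2, ONE)
         * so repeated multiplications never underflow to zero. */
        uint32_t max = 0;
        for (int j = 0; j < N; j++)
            if (next[j] > max) max = next[j];
        while (max && max < ONE / 2) {
            for (int j = 0; j < N; j++) next[j] <<= 1;
            max <<= 1;
            shifts++;
        }
        for (int j = 0; j < N; j++) alpha[j] = next[j];
    }
    return -shifts;
}

/* Example sequences: the first matches the model's preferred emissions,
 * so it should receive the higher (less negative) score. */
static const uint8_t SEQ_MATCH[4] = {0, 0, 0, 0};
static const uint8_t SEQ_OTHER[4] = {1, 1, 1, 1};
```

In a recognizer, one such model is trained per gesture and the model with the highest score wins; the precision study in the paper would correspond to varying `Q` and the word widths used here.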
