Parameters affecting interaction with gestures in ubiquitous environments

This paper addresses the lack of ordinary input devices for interaction in ubiquitous environments. We focus on the parameters that affect interaction through hand gestures as an alternative input method. Hand and object gestures can change according to the situation in which they are performed, including the location where the gesture is made, the body position of the user, the device being interacted with, the application being controlled, the shapes of the objects involved, and user preferences. Furthermore, the social environment and the user's current activity can also change gesture shapes. In this research, we studied the effect of these context parameters on gesture shapes and developed a system that supports users with gesture profiles appropriate to their context. We conducted experiments to observe the effect of changing body position on gesture shapes, and we applied social parameters to the subjects while they performed the experiments. The results showed that when subjects changed their situation, some gestures could no longer be performed. Moreover, when subjects selected appropriate objects for interaction, they interacted with their environment more accurately and quickly.
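The abstract describes a system that matches a user's current context to an appropriate gesture profile but gives no implementation details. The following Python sketch is purely illustrative: the `Context` fields, the `GestureProfile` structure, the `ProfileSelector` class, and its fallback heuristic are assumptions made here for clarity, not the authors' design.

```python
from dataclasses import dataclass
from typing import Dict, Optional


@dataclass(frozen=True)
class Context:
    """Hypothetical context parameters named in the abstract."""
    location: str        # e.g. "living_room", "office"
    position: str        # e.g. "sitting", "standing", "lying_down"
    device: str          # e.g. "tv", "media_player"
    social_setting: str  # e.g. "alone", "with_guests"


@dataclass
class GestureProfile:
    """A named set of gesture shapes mapped to device commands."""
    name: str
    gestures: Dict[str, str]  # gesture shape -> command


class ProfileSelector:
    """Selects the gesture profile registered for an observed context."""

    def __init__(self) -> None:
        self._profiles: Dict[Context, GestureProfile] = {}

    def register(self, ctx: Context, profile: GestureProfile) -> None:
        self._profiles[ctx] = profile

    def select(self, ctx: Context) -> Optional[GestureProfile]:
        # Prefer an exact context match; otherwise fall back to any
        # profile registered for the same device (an arbitrary
        # illustrative heuristic, not taken from the paper).
        if ctx in self._profiles:
            return self._profiles[ctx]
        for known, profile in self._profiles.items():
            if known.device == ctx.device:
                return profile
        return None


# Example use: a profile registered for sitting in front of the TV is
# reused when the user stands up, since the device is the same.
selector = ProfileSelector()
selector.register(
    Context("living_room", "sitting", "tv", "alone"),
    GestureProfile("tv_sitting", {"swipe_right": "next_channel"}),
)
profile = selector.select(Context("living_room", "standing", "tv", "alone"))
assert profile is not None and profile.name == "tv_sitting"
```

A real system would weight the context parameters (the paper's experiments suggest body position and social setting both matter), but a plain lookup is enough to show the profile-selection idea.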
