Multi-user Pointing and Gesture Interaction for Large Screen Using Infrared Emitters and Accelerometers

This paper presents PlusControl, a novel multi-user interaction system for cooperative work on large screens. The system is designed around economical in-air deictic and control gestures, and it allows users to move freely about the environment. PlusControl consists of lightweight wearable devices equipped with infrared emitters and Bluetooth accelerometers. This paper presents the architecture of the system. A prototype has been developed in order to test and evaluate the system's performance. Results show that PlusControl is a valuable tool in cooperative scenarios.
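A system of this kind typically fuses two per-user input streams: the infrared emitter gives the on-screen pointing position, while the accelerometer signals control gestures. The following is a minimal, hypothetical sketch of that fusion loop; all names, the threshold value, and the one-feature gesture classifier are illustrative assumptions, not details taken from the paper.

```python
from dataclasses import dataclass

@dataclass
class Cursor:
    """Per-user on-screen state (normalized coordinates)."""
    x: float = 0.0
    y: float = 0.0
    gesture: str = "idle"

def accel_magnitude(ax: float, ay: float, az: float) -> float:
    # Euclidean norm of one accelerometer sample (m/s^2)
    return (ax * ax + ay * ay + az * az) ** 0.5

def update_user(cursors: dict, user_id: str, ir_xy: tuple, accel: tuple) -> Cursor:
    """Update one user's cursor from an IR camera fix and an accelerometer sample.

    This is a sketch: a real system would track emitters across camera
    frames and run a trained gesture recognizer, not a single threshold.
    """
    c = cursors.setdefault(user_id, Cursor())
    c.x, c.y = ir_xy  # IR emitter position gives the deictic (pointing) target
    # Crude illustrative classifier: a strong acceleration spike is a "select"
    c.gesture = "select" if accel_magnitude(*accel) > 15.0 else "idle"
    return c

cursors = {}
update_user(cursors, "user-A", (0.25, 0.60), (0.1, 9.8, 0.2))   # hand at rest
update_user(cursors, "user-B", (0.80, 0.30), (12.0, 9.8, 6.0))  # sharp flick
```

Keeping one cursor record per user identifier is what makes the interaction multi-user on a single shared display: each worn device contributes an independent pointer and gesture channel.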
