User-centered gesture development for smart lighting

The aim of this study is to investigate how people express hand gestures when controlling a smart lighting system. Technological development has brought us smart devices through which we can control multiple functions of one or more systems. To use these functions fully, however, we need a Natural User Interface (NUI) with intuitive hand-gesture control. We therefore conducted a gesture-elicitation experiment with 20 subjects to investigate and identify which kinds of hand gestures can be used to control a smart lighting system. The results show that the elicited hand gestures fall into three categories, and differences between the gesture types were identified. In addition, the choice of hand, such as the dominant hand or both hands, affects people's ability to express gestures. This preliminary study identifies several important issues for a hand-gesture-based interface for smart lighting systems. It can be further improved through verification that considers the context of use, hand-preference testing, application to different devices or systems, and gesture evaluation.
