A Holistic Framework for Hand Gestures Design

Hand gesture based interfaces are a proliferating area for immersive and augmented reality systems due to the rich interaction provided by this type of modality. However, proper design of such interfaces requires not only accurate recognition but also usability, ergonomic design, and comfort; in most of the interfaces being developed, the primary focus is on accurate gesture recognition alone. Formally, an optimal hand gesture vocabulary (GV) can be defined as a set of gesture-command associations such that the time τ to perform a task is minimized over all possible hand gestures in our ontology. In this work, we consider three cost functions as proxies for task completion time: intuitiveness Z1(GV), comfort Z2(GV), and recognition accuracy Z3(GV). Hence, our multiobjective problem (MOP) is to maximize Zi(GV), i = 1, 2, 3, over all GVs. Because finding solutions to the MOP requires a large amount of computation time, an analytical methodology is proposed in which the MOP is converted to a dual-priority objective problem where recognition accuracy is of prime importance and the human performance objectives are secondary. This work, as opposed to previous research by the authors, focuses on two aspects: first, a modified cost function for an enhanced simulated annealing approach is explained and implementation issues are discussed; second, a comparative study is performed between hand gesture vocabularies obtained using the suggested methodology and vocabularies hand-picked by individuals. The superiority of our method is demonstrated in the context of a robotic vehicle control task using hand gestures.
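The dual-priority idea (recognition accuracy first, human performance factors second) can be sketched as a penalized simulated annealing search over gesture-command assignments. The following is a minimal illustrative sketch, not the authors' implementation: the score tables `accuracy`, `intuitiveness`, and `comfort`, the accuracy floor `acc_floor`, and the neighborhood moves are assumptions made for the example.

```python
import math
import random

def lexicographic_cost(assignment, accuracy, intuitiveness, comfort, acc_floor=0.95):
    """Dual-priority cost (hypothetical): heavily penalize vocabularies whose mean
    recognition accuracy falls below a required floor; otherwise reward the
    human-factors scores (intuitiveness + comfort). Lower cost is better."""
    gestures = list(assignment.values())
    mean_acc = sum(accuracy[g] for g in gestures) / len(gestures)
    human = sum(intuitiveness[(c, g)] + comfort[g] for c, g in assignment.items())
    penalty = 0.0 if mean_acc >= acc_floor else 1e6 * (acc_floor - mean_acc)
    return penalty - human

def anneal(commands, gestures, accuracy, intuitiveness, comfort,
           t0=1.0, cooling=0.995, iters=20000, seed=0):
    """Simulated annealing over gesture-command assignments (illustrative only)."""
    rng = random.Random(seed)
    # Start from a random one-to-one assignment of gestures to commands.
    current = dict(zip(commands, rng.sample(gestures, len(commands))))
    cost = lexicographic_cost(current, accuracy, intuitiveness, comfort)
    best, best_cost, t = dict(current), cost, t0
    for _ in range(iters):
        cand = dict(current)
        c = rng.choice(commands)
        unused = [g for g in gestures if g not in cand.values()]
        if unused and rng.random() < 0.5:
            cand[c] = rng.choice(unused)        # reassign an unused gesture
        else:
            c2 = rng.choice(commands)           # swap gestures between two commands
            cand[c], cand[c2] = cand[c2], cand[c]
        new_cost = lexicographic_cost(cand, accuracy, intuitiveness, comfort)
        # Accept improvements always, worsenings with temperature-dependent probability.
        if new_cost < cost or rng.random() < math.exp((cost - new_cost) / max(t, 1e-9)):
            current, cost = cand, new_cost
            if cost < best_cost:
                best, best_cost = dict(current), cost
        t *= cooling
    return best, best_cost
```

As a usage sketch, `commands` might be a small task set such as forward, back, left, right, and stop for the robotic vehicle, while `gestures` is a larger candidate pool whose accuracy, intuitiveness, and comfort scores were estimated beforehand from recognizer tests and user studies.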
