Agreement Study Using Gesture Description Analysis

Choosing adequate gestures for touchless interfaces is a challenging task that directly impacts human–computer interaction. Such gestures are commonly determined by designer-driven, ad-hoc, rule-based, or agreement-based methods. Previous approaches to assessing agreement grouped gestures into equivalence classes and ignored the integral properties shared between them. In this article, we propose a generalized framework that inherently incorporates gesture descriptors into the agreement analysis. In contrast to previous approaches, we represent gestures as binary description vectors and allow them to be partially similar. In this context, we introduce a new metric, referred to as the soft agreement rate (<inline-formula><tex-math notation="LaTeX">$\mathcal {SAR}$</tex-math></inline-formula>), to measure the level of agreement, and we provide a mathematical justification for this metric. Furthermore, we perform computational experiments to study the behavior of <inline-formula><tex-math notation="LaTeX">$\mathcal {SAR}$</tex-math></inline-formula> and demonstrate that existing agreement metrics are a special case of our approach. Our method is evaluated and tested through a guessability study conducted with a group of neurosurgeons; nevertheless, our formulation can be applied to any other user-elicitation study. Results show that the level of agreement obtained by <inline-formula><tex-math notation="LaTeX">$\mathcal {SAR}$</tex-math></inline-formula> is 2.64 times higher than that of previous metrics. Finally, we show that our approach complements existing agreement techniques by generating an artificial lexicon based on the most agreed-upon properties.
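To make the idea concrete, the following is a minimal sketch of a soft agreement computation over binary description vectors. It assumes (the abstract does not specify the exact formula) that the soft agreement rate for one referent averages a pairwise similarity, here illustrated with the Jaccard coefficient, over all ordered pairs of participants' proposals; the function name `soft_agreement_rate` and the descriptor layout are hypothetical. With a strict 0/1 exact-match similarity in place of Jaccard, this average reduces to the classical agreement rate computed over equivalence classes, which is the sense in which existing metrics are a special case.

```python
import numpy as np

def jaccard(u, v):
    """Jaccard similarity between two binary descriptor vectors."""
    inter = np.logical_and(u, v).sum()
    union = np.logical_or(u, v).sum()
    return inter / union if union else 1.0

def soft_agreement_rate(proposals, sim=jaccard):
    """Average pairwise similarity over all ordered pairs of proposals
    for a single referent. Plugging in an exact-match indicator for
    `sim` recovers the classical (hard) agreement rate."""
    n = len(proposals)
    if n < 2:
        return 1.0
    total = 0.0
    for i in range(n):
        for j in range(n):
            if i != j:
                total += sim(proposals[i], proposals[j])
    return total / (n * (n - 1))

# Hypothetical example: three participants describe the same referent
# with 4-bit descriptors (e.g., one-handed, dynamic, lateral, open-palm).
props = [np.array([1, 1, 1, 0]),
         np.array([1, 1, 1, 1]),
         np.array([1, 1, 0, 1])]
rate = soft_agreement_rate(props)  # partially similar proposals -> 0 < rate < 1
```

Note that under a hard (exact-match) similarity the three proposals above would fall into three singleton classes and yield zero agreement, whereas the soft metric credits their shared properties.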
