GestAnalytics: Experiment and Analysis Tool for Gesture-Elicitation Studies

Gesture-elicitation studies are a common and important method for understanding user preferences. In these studies, researchers aim to extract the gestures that users find desirable for different kinds of interfaces. During this process, researchers must manually analyze many videos, which is a tiring and time-consuming task. Although current video-analysis tools provide annotation support and features such as automatic gesture analysis, researchers still need to (1) divide videos into meaningful pieces, (2) manually examine each piece, (3) match the collected user data with these pieces, (4) code each video, and (5) verify their coding. These steps are burdensome, and current tools do not aim to make them easier or faster. To fill this gap, we developed "GestAnalytics", which offers simultaneous video monitoring, video tagging, and filtering. Our internal pilot tests show that GestAnalytics can be a beneficial tool for researchers who conduct video analysis for gestural interfaces.
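The abstract describes a concrete workflow: segmenting session videos into pieces, matching user data to each piece, coding (tagging) the pieces, and filtering them. As a minimal illustrative sketch only, not GestAnalytics' actual implementation (the names Segment, tag_segment, and filter_by_tag are hypothetical), such a data model could be expressed in Python as follows:

from __future__ import annotations
from dataclasses import dataclass, field

@dataclass
class Segment:
    """One meaningful piece of a session video (step 1 of the workflow)."""
    video_id: str
    start_s: float   # segment start time, in seconds
    end_s: float     # segment end time, in seconds
    participant: str # collected user data matched to the piece (step 3)
    tags: set[str] = field(default_factory=set)  # researcher codes (step 4)

def tag_segment(segment: Segment, tag: str) -> None:
    """Attach a code/tag to a segment (steps 4 and 5: coding and verifying)."""
    segment.tags.add(tag)

def filter_by_tag(segments: list[Segment], tag: str) -> list[Segment]:
    """Keep only segments carrying a given tag (the filtering feature)."""
    return [s for s in segments if tag in s.tags]

# Example: two segments from one hypothetical elicitation session
segments = [
    Segment("session01.mp4", 12.0, 18.5, "P1"),
    Segment("session01.mp4", 40.2, 47.0, "P2"),
]
tag_segment(segments[0], "swipe")
tag_segment(segments[1], "pinch")
print([s.participant for s in filter_by_tag(segments, "swipe")])  # -> ['P1']

Under this sketch, simultaneous monitoring would amount to iterating over segments of several videos at once, while verification (step 5) is a second pass over the same tags.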
