Developing 3D Freehand Gesture-Based Interaction Methods for Virtual Walkthroughs: Using an Iterative Approach

Gesture-based 3D interaction is a relevant research topic with natural applications in several scenarios. Yet it presents several challenges: its relative novelty means systematic development methodologies are still lacking, and it suffers from inherent usability problems. Moreover, it is not always obvious which gestures are the most adequate and intuitive, and users may perform a variety of different gestures for similar actions. This chapter describes how spatial freehand gesture-based navigation methods were developed for virtual walkthroughs experienced on large displays, using a depth sensor for gesture tracking. Several iterations of design, implementation, user tests, and controlled experiments, performed as formative and summative evaluation to improve, validate, and compare the methods, are presented and discussed.
