Free-hand Gesture Interfaces for an Augmented Exhibition Podium

In this paper we present an augmented exhibition podium that supports natural free-hand 3D interaction for visitors using their own mobile phones or smart glasses. Visitors can point the camera of their device at the podium to see Augmented Reality (AR) content overlaid on a physical exhibit, and can use free-hand gestures to interact with that content. For instance, instead of touching the phone screen, they can pinch with their fingers to select different parts of the exhibit and view augmented text descriptions. The prototype combines vision-based image tracking with free-hand gesture detection from a depth camera in a client-server framework, so visitors can interact with the augmented exhibit using their bare hands without requiring special hardware (e.g. a depth sensor) on their personal devices. Results from our pilot user study show that the prototype is as intuitive to use as a traditional touch-based interface and provides a more fun and engaging experience.
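
The client-server split described above (depth-camera gesture sensing on the podium side, AR rendering on the visitor's own device) can be illustrated with a minimal sketch. The code below is an assumption-laden illustration rather than the authors' implementation: the `GestureServer` class, the UDP transport, the port number, and the JSON message layout are all hypothetical.

```python
# Minimal sketch of a podium-side gesture relay. Everything here
# (class name, port, message format) is a hypothetical illustration,
# not the paper's actual protocol.
import json
import socket
import threading
import time

PORT = 9000  # assumed port for gesture broadcasts


class GestureServer:
    """Runs beside the podium's depth camera and relays gesture events
    (e.g. a detected pinch and its 3D position) to registered mobile
    clients, so the clients need no depth sensor of their own."""

    def __init__(self, port: int = PORT) -> None:
        self.sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        self.sock.bind(("0.0.0.0", port))
        self.clients: set = set()

    def serve(self) -> None:
        # Clients register by sending any datagram; afterwards they
        # receive every gesture event the depth camera produces.
        while True:
            _, addr = self.sock.recvfrom(64)
            self.clients.add(addr)

    def broadcast(self, gesture: str, position: tuple) -> None:
        # One small JSON datagram per event keeps latency low enough
        # for the AR overlay to react to the gesture in real time.
        msg = json.dumps({"gesture": gesture, "pos": position}).encode()
        for addr in self.clients:
            self.sock.sendto(msg, addr)


if __name__ == "__main__":
    server = GestureServer()
    threading.Thread(target=server.serve, daemon=True).start()

    # Simulate a visitor's phone registering over loopback.
    client = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    client.sendto(b"hello", ("127.0.0.1", PORT))
    time.sleep(0.1)  # let the server record the client's address

    # In the real system this event would come from depth-camera hand
    # tracking; here we emit one synthetic pinch with a 3D position.
    server.broadcast("pinch", (0.12, 0.05, 0.48))
    print(client.recvfrom(1024)[0].decode())
```

In a real deployment, `broadcast()` would be fed from the depth camera's hand-tracking loop, and each client would map the received 3D gesture position into its own view using the pose recovered by image tracking before rendering the selection feedback.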
