Bimanual marking menu for near-surface interactions

We describe a mouseless, near-surface version of the Bimanual Marking Menu system. To activate the menu, users pinch with either the index or middle finger, triggering a left or right click respectively, and then mark in the 3D space near the interactive surface. We demonstrate how the system can be implemented using a commodity range camera such as the Microsoft Kinect, and report on several designs of the 3D marking system. Like the multi-touch marking menu, our system offers a large number of easily accessible commands. Because it does not rely on contact points to operate, it also leaves the non-dominant hand free for other multi-touch interactions.
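The selection logic described above can be sketched in a few lines: a pinch of a given finger selects the button, and the direction of the subsequent 3D mark, projected onto the surface plane, selects a menu item. This is a minimal illustrative sketch, not the paper's implementation; the eight-way sector layout, the item names, and the function names are assumptions made for the example.

```python
import math

# Illustrative eight-item menu layout (an assumption, not the paper's design).
MENU_ITEMS = ["E", "NE", "N", "NW", "W", "SW", "S", "SE"]

def pinch_to_button(finger):
    """Map the pinching finger to a mouse button, as described in the
    abstract: index-finger pinch -> left click, middle-finger -> right."""
    return {"index": "left", "middle": "right"}[finger]

def classify_mark(start, end):
    """Classify a 3D mark into one of 8 sectors, as in a single-level
    marking menu. start/end are (x, y, z) points from the range camera;
    only the surface-plane (x, y) component of the stroke is used."""
    dx = end[0] - start[0]
    dy = end[1] - start[1]
    angle = math.atan2(dy, dx)                 # -pi..pi, 0 rad = +x axis
    sector = round(angle / (math.pi / 4)) % 8  # nearest 45-degree sector
    return MENU_ITEMS[sector]
```

For example, an index-finger pinch followed by a mark straight "up" in the surface plane would yield `pinch_to_button("index")` → `"left"` and `classify_mark((0, 0, 0), (0, 1, 0))` → `"N"`. A real system would add hysteresis and a minimum mark length before committing a selection.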
