InvisiShapes: A Recognition System for Sketched 3D Primitives in Continuous Interaction Spaces

Continued improvements in, and the rising ubiquity of, touchscreen and motion-sensing technologies enable users to leverage mid-air input modalities that extend intelligent surface sketching into the third dimension. However, existing approaches largely focus on constrained 3D gesture sets, require specialized hardware setups, or do not move beyond surface-sketching assumptions. We present InvisiShapes, a recognition system that lets users sketch 3D geometric primitives in continuous interaction spaces spanning both the surface and the mid-air environment above it. Our system combines a collection of sketch and gesture recognition techniques and heuristics, and takes advantage of easily accessible computing hardware, so that users can incorporate depth into their sketches. In our interaction study and user evaluations, the system achieved strong recognition performance on collected sketch+motion data and supported intuitive interaction in live sketching scenarios.
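For illustration, the following is a minimal, hypothetical sketch of how a captured 3D stroke might be matched against primitive templates, in the spirit of the template-based recognizers this line of work builds on (e.g., the $-family and Protractor3D). The function names, template set, and resampling count are assumptions for the example, not the paper's actual pipeline.

```python
import math

N = 64  # number of resampled points per stroke (assumed, not from the paper)

def resample(points, n=N):
    """Resample a 3D polyline (list of (x, y, z) tuples) to n evenly spaced points."""
    total = sum(math.dist(points[i - 1], points[i]) for i in range(1, len(points)))
    if total == 0:
        return [points[0]] * n
    interval = total / (n - 1)
    pts = list(points)
    resampled = [pts[0]]
    accumulated = 0.0
    i = 1
    while i < len(pts):
        d = math.dist(pts[i - 1], pts[i])
        if accumulated + d >= interval and d > 0:
            t = (interval - accumulated) / d
            q = tuple(pts[i - 1][k] + t * (pts[i][k] - pts[i - 1][k]) for k in range(3))
            resampled.append(q)
            pts.insert(i, q)  # continue measuring from the interpolated point
            accumulated = 0.0
        else:
            accumulated += d
        i += 1
    while len(resampled) < n:  # guard against floating-point shortfall
        resampled.append(pts[-1])
    return resampled[:n]

def normalize(points):
    """Translate to the centroid and scale into a unit bounding cube."""
    cx = sum(p[0] for p in points) / len(points)
    cy = sum(p[1] for p in points) / len(points)
    cz = sum(p[2] for p in points) / len(points)
    centered = [(p[0] - cx, p[1] - cy, p[2] - cz) for p in points]
    scale = max(max(abs(c) for c in p) for p in centered) or 1.0
    return [(p[0] / scale, p[1] / scale, p[2] / scale) for p in centered]

def classify(stroke, templates):
    """Return the primitive label whose template has the smallest average
    point-to-point distance to the normalized, resampled stroke."""
    candidate = normalize(resample(stroke))
    best_label, best_score = None, float("inf")
    for label, template in templates.items():
        ref = normalize(resample(template))
        score = sum(math.dist(a, b) for a, b in zip(candidate, ref)) / N
        if score < best_score:
            best_label, best_score = label, score
    return best_label, best_score
```

Usage would amount to calling classify(stroke, templates) with a stroke assembled from touch-surface points plus depth samples from a motion sensor and a dictionary of primitive templates (e.g., cube, sphere, cylinder). The actual InvisiShapes system combines several recognition techniques and heuristics beyond this single matcher.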
