User-Defined Conceptual Modeling Gestures

Gesture- and speech-based interaction offers designers a powerful technique for creating 3D CAD models. Previous studies on gesture-based modeling have employed author-defined gestures, which may not be user friendly. The aim of this study was to collect a dataset of user-generated gestures and accompanying voice commands for 3D modeling during form exploration in the conceptual architectural design phase. We conducted an experiment with 41 subjects to elicit their preferences for gestures and speech across twelve 3D CAD modeling referents. In this paper we present the different types of gestures we found and report user preferences for gestures and speech. Findings from this study will inform the design of a speech- and gesture-based CAD modeling interface.
