GesCAD: an intuitive interface for conceptual architectural design

Gesture- and speech-based 3D modeling offers designers a powerful and intuitive way to create 3D Computer Aided Design (CAD) models. Rather than relying on arbitrary gestures and speech commands defined by researchers, which may not feel intuitive to users, such a natural user interface should be built on a gesture and command set elicited from the users themselves. We describe our ongoing research on a speech-and-gesture-based CAD modeling interface, GesCAD, implemented by combining Microsoft Kinect with Rhino, a leading CAD application. GesCAD is based on gestures and speech commands elicited in a specially designed user experiment. We conducted a preliminary user study with six participants to evaluate the user experience of our prototype, including ease of use, physical comfort, and satisfaction with the models created. Results show that participants found the overall experience of using GesCAD fun and the speech and gesture commands easy to remember.
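As a rough illustration of the kind of pipeline the abstract describes, the sketch below routes recognized speech phrases (paired with gesture data such as tracked hand positions) to modeling actions. The command names, handlers, and state structure here are illustrative assumptions, not the actual GesCAD vocabulary or its Kinect/Rhino integration.

```python
# Hypothetical sketch of a speech-and-gesture command dispatcher.
# In a real system, `speech` would come from a speech recognizer and
# `gesture` from skeleton/hand tracking (e.g. Kinect); here both are
# plain values so the routing logic can be shown on its own.

def make_box(state, gesture):
    # The gesture supplies geometry parameters, e.g. two hand positions
    # defining opposite corners of a box.
    state["shapes"].append(("box", gesture))
    return state

def extrude(state, gesture):
    state["shapes"].append(("extrude", gesture))
    return state

def undo(state, gesture):
    if state["shapes"]:
        state["shapes"].pop()
    return state

# Speech vocabulary -> handler, as an elicited command set might define it.
COMMANDS = {
    "create box": make_box,
    "extrude": extrude,
    "undo": undo,
}

def handle(state, speech, gesture=None):
    """Route one recognized speech phrase, plus optional gesture data."""
    action = COMMANDS.get(speech.strip().lower())
    if action is None:
        return state  # unrecognized phrase: ignore rather than guess
    return action(state, gesture)
```

In a Rhino-backed implementation, each handler would call into the CAD kernel instead of appending to a list; the point of the sketch is only that an elicited command set reduces to a small, user-memorable vocabulary mapped onto modeling operations.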
