NATURAL AND INTUITIVE GESTURE INTERACTION FOR 3D OBJECT MANIPULATION IN CONCEPTUAL DESIGN

Gesture interaction with three-dimensional (3D) representations is increasingly explored; however, there is little research on the nature of the gestures used. A study was conducted to explore the gestures designers perform naturally and intuitively while interacting with 3D objects during conceptual design. The findings demonstrate that different designers perform similar gestures for the same activities, and that their interaction with a 3D representation on a 2D screen is consistent with what would be expected if a physical object were suspended in the air in front of them.
