GaFinC: Gaze and Finger Control interface for 3D model manipulation in CAD application

Natural and intuitive interfaces for CAD modeling, such as hand gesture controls, have received much attention recently. However, despite their high intuitiveness and familiarity, hand gesture controls have proven less comfortable in actual applications than a conventional mouse interface because of the physical fatigue users experience over long periods of operation. In this paper, we propose an improved gesture control interface for 3D model manipulation tasks that offers usability comparable to a conventional interface and low user fatigue while maintaining a high level of intuitiveness. By analyzing the problems of previous hand gesture controls in translation, rotation, and zooming, we developed GaFinC, a multi-modal Gaze and Finger Control interface. GaFinC tracks precise hand positions, recognizes several finger gestures, and uses an independent gaze pointing interface to set the point of interest. To verify the performance of GaFinC, we conducted tests of manipulation accuracy and time and compared the results with those of a conventional mouse; comfort and intuitiveness were also scored through user interviews. Although GaFinC fell short of the mouse in accuracy and completion time, its performance was at a practically applicable level, and users found it more intuitive than a mouse interface while maintaining a usable level of comfort.
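As a rough illustration of the control scheme described above, the following Python sketch shows how such a multi-modal loop might dispatch gaze and finger input: gaze fixes the point of interest (POI) while finger gestures drive translation, rotation, and zooming. The tracker state, gesture names, and scene methods are hypothetical stand-ins, not the paper's actual API.

```python
# Minimal sketch (not the authors' implementation) of a gaze-anchored
# gesture-manipulation step. HandState, the gesture labels, and the
# scene's translate/rotate_about/zoom_about methods are all hypothetical.

from dataclasses import dataclass
from typing import Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class HandState:
    position: Vec3   # fingertip position in world coordinates
    gesture: str     # e.g. "pinch", "grab", "spread", "idle"

def step(poi: Vec3, prev: HandState, curr: HandState, scene) -> None:
    """Apply one frame of manipulation.

    poi   -- point of interest fixed by the gaze tracker
    scene -- hypothetical CAD scene object receiving the transforms
    """
    # Frame-to-frame fingertip displacement drives the manipulation.
    delta = tuple(c - p for c, p in zip(curr.position, prev.position))

    if curr.gesture == "pinch":          # pinch-and-drag translates the model
        scene.translate(delta)
    elif curr.gesture == "grab":         # grab-and-turn rotates about the gaze POI
        scene.rotate_about(poi, delta)
    elif curr.gesture == "spread":       # spread/contract zooms toward the gaze POI
        scene.zoom_about(poi, 1.0 + delta[2])
    # "idle" and unrecognized gestures leave the scene untouched
```

The design point this sketch captures is the division of labor the abstract describes: the gaze channel only selects *where* the manipulation is anchored, so the hand never has to hold a pointing posture, which is what causes fatigue in gesture-only interfaces.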
