Design review of CAD assemblies using bimanual natural interface

The interaction metaphor based on mouse, monitor, and keyboard shows evident limits in engineering design review, where real and virtual models must be explored and compared, and in “outside-the-office” environments where no desk is available. This research explores a new generation of gesture-based interfaces, called “natural interfaces”, which promise intuitive free-hand control without desk support. We present a novel natural design review workspace that acquires user motion with a combination of video and depth cameras and visualizes CAD models using monitor-based augmented reality. We implemented a bimanual egocentric pointing paradigm based on a virtual active surface placed in front of the user. Using an XML-configurable approach, we explored bimanual gesture commands to browse, select, assemble/disassemble, and explode complex 3D models imported in the standard STEP format. Our experiments demonstrated that the virtual active surface can effectively trigger a set of CAD-specific commands and improve technical navigation in non-desktop environments, e.g. shop-floor maintenance and on-site quality control. We evaluated the feasibility and robustness of the interface and found a high degree of user acceptance: participants preferred the presented interface to unconstrained 3D manipulation.
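The XML-configurable mapping from gestures to CAD commands described above could be sketched as follows. This is a hypothetical illustration only: the tag names, attributes, and command identifiers are assumptions, not the schema actually used in the presented system.

```python
import xml.etree.ElementTree as ET

# Illustrative gesture-command configuration in the spirit of the paper's
# XML-configurable approach; element and attribute names are invented here.
CONFIG = """
<gestures>
  <gesture name="select"  hands="right" action="pick_part"/>
  <gesture name="explode" hands="both"  action="explode_assembly"/>
  <gesture name="orbit"   hands="both"  action="rotate_view"/>
</gestures>
"""

def load_gesture_map(xml_text):
    """Parse the XML and return {gesture name: (hands required, CAD command)}."""
    root = ET.fromstring(xml_text)
    return {g.get("name"): (g.get("hands"), g.get("action"))
            for g in root.findall("gesture")}

gesture_map = load_gesture_map(CONFIG)
print(gesture_map["explode"])  # ('both', 'explode_assembly')
```

Keeping the gesture-to-command bindings in an external file like this lets the command set be extended or remapped without recompiling the recognition pipeline, which is the practical appeal of the configurable approach.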
