3D modeling is used across many fields of computer graphics. With the growth of gesture-based interaction, virtual reality, and embodied cognition research, new technologies have been developed to improve modeling efficiency or to give users a more natural, intuitive experience. In this paper, we compare, from a user experience perspective, three methods for navigating 3D objects in a virtual modeling environment: simple bare-hand gestures, tangible user interfaces (TUIs) with an object in hand, and mouse/keyboard as the primary input. Based on embodied cognition theory, we hypothesize that the object-in-hand method may provide a better user experience, since the interaction between the object and the hand can enhance the user's cognition while navigating a model. We present a conceptual design with two approaches and three design models that demonstrate differences in how users interact with 3D modeling software.
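As an illustration only (not part of the paper's design), the sketch below contrasts how the two input styles could drive model rotation about the vertical axis: a relative, sensitivity-scaled mapping for mouse drag versus a direct mapping of tracked wrist orientation for the bare-hand or object-in-hand conditions. All function names, parameters, and gains here are hypothetical.

```python
# Hypothetical sketch: mapping navigation input to a model rotation.
# Assumes input deltas (mouse drag pixels, tracked wrist yaw) are already captured.
import math


def rotation_about_y(angle_rad):
    """3x3 rotation matrix about the vertical (y) axis."""
    c, s = math.cos(angle_rad), math.sin(angle_rad)
    return [[c, 0.0, s],
            [0.0, 1.0, 0.0],
            [-s, 0.0, c]]


def mouse_orbit_yaw(dx_pixels, sensitivity=0.005):
    """Mouse/keyboard style: horizontal drag maps to a small yaw increment."""
    return rotation_about_y(dx_pixels * sensitivity)


def hand_orbit_yaw(wrist_yaw_rad, gain=1.0):
    """Bare-hand / object-in-hand style: tracked wrist yaw maps directly
    (optionally scaled) to model yaw, so the model turns with the hand."""
    return rotation_about_y(wrist_yaw_rad * gain)


if __name__ == "__main__":
    print(mouse_orbit_yaw(120))         # incremental turn from a 120-pixel drag
    print(hand_orbit_yaw(math.pi / 4))  # direct mapping of a 45-degree wrist turn
```

The key difference the sketch highlights is that the mouse mapping is relative and tuned by a sensitivity constant, whereas the hand-based mapping is absolute, which is one plausible reason an object-in-hand interface could feel more direct to users.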