Developing and analyzing intuitive modes for interactive object modeling

In this paper we present two approaches for intuitive, interactive modeling of specific object attributes using dedicated sensor hardware. After a brief overview of the state of the art in interactive, intuitive object modeling, we motivate the modeling task by deriving the different object attributes to be modeled from an analysis of important interactions with objects. As an example domain, we chose the setting of a service robot in a kitchen. Tasks from this domain were used to derive important basic actions, from which in turn the necessary object attributes were inferred. In the main section of the paper, two of the derived attributes are presented, each with an intuitive interactive modeling method. The object attributes to be modeled are stable object positions and movement restrictions for objects. Both intuitive interaction methods were evaluated with a group of test participants, and the results are discussed. The paper ends with conclusions on the discussed results and an outlook on future work in this area, in particular potential applications.