Evaluation of auditory feedback on task performance in virtual assembly environment

This paper presents our approach to integrating auditory feedback into a virtual assembly environment (VAE) and investigates the effect of auditory and visual feedback on assembly task performance in virtual environments (VEs). The experimental system platform brought together complex technologies such as constraint-based assembly simulation, optical motion tracking, and real-time 3D sound generation around a virtual reality (VR) workbench and a common software platform. Several experiments were conducted to explore and evaluate the effectiveness of neutral, visual, auditory, and integrated feedback mechanisms on task performance in the context of assembly simulation in VEs. A peg-in-a-hole assembly task was used as the test case, performed by sixteen subjects. Both objective performance data (task completion time) and subjective opinions (questionnaires) on the use of auditory and visual feedback in the VAE were collected from the experiments. The results showed that the addition of auditory feedback improved virtual assembly task performance. They also indicated a statistically significant effect: the combination of auditory and visual feedback yielded better assembly task performance than either feedback mechanism used alone. Most users preferred the combined feedback to either individual mechanism. The subjects' comments indicated that unrealistic feedback hinders performance and increases the level of frustration.
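To make the experimental comparison concrete, the sketch below illustrates one way the four-condition completion-time data (neutral, visual, auditory, integrated; sixteen subjects) could be analysed. The data values and the choice of a Friedman test with paired follow-ups are assumptions for illustration only; the abstract does not state which statistical procedure the authors actually used.

```python
# Minimal analysis sketch (not the authors' code): comparing task completion
# times across the four feedback conditions with synthetic data.
import numpy as np
from scipy import stats

# Hypothetical completion times in seconds, shape (subjects, conditions).
rng = np.random.default_rng(0)
conditions = ["neutral", "visual", "auditory", "integrated"]
times = np.column_stack([
    rng.normal(60, 8, 16),   # neutral
    rng.normal(52, 7, 16),   # visual
    rng.normal(50, 7, 16),   # auditory
    rng.normal(44, 6, 16),   # integrated
])

# Friedman test: non-parametric repeated-measures comparison across conditions.
stat, p = stats.friedmanchisquare(*[times[:, i] for i in range(len(conditions))])
print(f"Friedman chi-square = {stat:.2f}, p = {p:.4f}")

# Paired follow-ups: integrated feedback vs. each other condition.
for i, name in enumerate(conditions[:-1]):
    t, p_pair = stats.ttest_rel(times[:, -1], times[:, i])
    print(f"integrated vs. {name}: t = {t:.2f}, p = {p_pair:.4f}")
```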
