Exploring interface with representation of gesture for remote collaboration

This paper reports on a laboratory study of a gesture representation interface for remote collaboration on physical tasks. The experiment assessed two gesture representations (hands vs. cursor pointer) in the context of a video-mediated interface that included a view of the remote partner, measured by task performance and users' perception of the interaction. We found no significant difference in task performance between the hands condition and the pointer condition. However, participants reported an overall preference for the pointer functionality over the hands functionality. At the same time, participants perceived a significantly higher quality of interaction in the hands condition than in the pointer condition. Additionally, the majority of participants valued being able to see each other's face during the collaboration. We conclude with a discussion of the importance of accounting for users' perception of interaction, in addition to the traditional task performance measure, when evaluating gesture representation interfaces, and of the importance of considering both factors when recommending the most suitable interface design with gesture representation for collaboration on physical tasks.
