Exploring camera viewpoint control models for a multi-tasking setting in teleoperation

Camera viewpoint control plays a vital role in many teleoperation activities, as watching live video streams remains the fundamental way for operators to gain situational awareness of remote environments. Motivated by a real-world industrial setting in mining teleoperation, we explore several candidate solutions to a common multi-tasking situation in which an operator must control a robot while simultaneously operating a remote camera. Conventional control interfaces predominate in such settings, but they can overload the operator's manual capacity and demand frequent attention switches, which can reduce productivity. We report an empirical user study in a model multi-tasking teleoperation setting in which the user has a main task that demands their attention. We compare three camera viewpoint control models: (1) dual manual control, (2) natural interaction (combining eye gaze and head motion), and (3) autonomous tracking. The results indicate the advantages of the natural interaction model, while the manual control model performed worst.
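As an illustration of the natural interaction model, here is a minimal sketch of how gaze and head motion could be fused into pan/tilt commands for a remote camera. This is a hypothetical construction, not the authors' implementation; all names, gains, and thresholds are illustrative assumptions.

    # Hypothetical "natural interaction" viewpoint controller:
    # gaze offset from the frame centre provides fine adjustment,
    # while deliberate head rotation adds a coarse, faster one.
    from dataclasses import dataclass

    @dataclass
    class GazeSample:
        x: float  # normalized gaze x in [0, 1] on the video frame
        y: float  # normalized gaze y in [0, 1]

    @dataclass
    class HeadPose:
        yaw: float    # degrees, positive = right
        pitch: float  # degrees, positive = down

    def pan_tilt_command(gaze: GazeSample, head: HeadPose,
                         dead_zone: float = 0.15,
                         head_gain: float = 0.5) -> tuple:
        """Fuse gaze and head pose into (pan, tilt) velocities in [-1, 1]."""
        # Gaze contribution: offset from frame centre, with a dead zone
        # so ordinary fixations near the centre do not move the camera.
        dx, dy = gaze.x - 0.5, gaze.y - 0.5
        pan = dx if abs(dx) > dead_zone else 0.0
        tilt = dy if abs(dy) > dead_zone else 0.0

        # Head contribution: yaw/pitch scaled by an assumed 45-degree
        # range, for coarse, intentional camera moves.
        pan += head_gain * (head.yaw / 45.0)
        tilt += head_gain * (head.pitch / 45.0)

        clamp = lambda v: max(-1.0, min(1.0, v))
        return clamp(pan), clamp(tilt)

    # Example: gaze near the right edge plus a slight rightward head
    # turn yields a positive pan command, leaving tilt at zero.
    print(pan_tilt_command(GazeSample(0.85, 0.5), HeadPose(yaw=10.0, pitch=0.0)))

The dead zone reflects one plausible design choice for hands-free input: small, involuntary gaze shifts should not disturb the viewpoint, so only sustained off-centre attention or deliberate head motion drives the camera.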
