Tasks carried out remotely via a telerobotic system are typically complex, take place in hazardous environments and require fine control of the robot's movements. Telepresence systems provide the teleoperator with a feeling of being physically present at the remote site. Stereoscopic video has been successfully applied to telepresence vision systems to increase the operator's perception of depth in the remote scene, and this sense of presence can be further enhanced by using computer-generated stereo graphics to augment the visual information presented to the operator. Over the past seven years the Mechatronic Systems and Robotics Research Group has developed a number of high-performance active stereo vision systems, culminating in the latest, a four-degree-of-freedom stereohead. The stereohead carries two miniature color cameras and is controlled in real time by the motion of the operator's head; the operator views the stereoscopic video images on an immersive head-mounted display or a stereo monitor. The stereohead is mounted on a mobile robot whose movement is controlled through a joystick interface. This paper describes the active telepresence system and the development of a prototype augmented reality (AR) application to enhance the operator's sense of presence at the remote site. The initial enhancements are a virtual map and compass to aid navigation in degraded visual conditions, and a virtual cursor that provides a means for the operator to interact with the remote environment. The results of preliminary experiments with these initial enhancements are presented.
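To make the head-slaved control loop concrete, the following is a minimal, hypothetical sketch of how tracked head orientation could be mapped to stereohead joint commands. The joint set, ranges, rate limit and interfaces shown here are illustrative assumptions, not details taken from the paper.

```python
# Hypothetical sketch: slave a pan/tilt stereohead to the operator's tracked head pose.
# Joint names, ranges and the tracker/robot interfaces are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class HeadPose:
    yaw: float    # degrees, positive = looking left
    pitch: float  # degrees, positive = looking up

@dataclass
class StereoheadLimits:
    pan_range: tuple = (-90.0, 90.0)   # mechanical pan limits, degrees
    tilt_range: tuple = (-45.0, 45.0)  # mechanical tilt limits, degrees
    max_step: float = 5.0              # max change per control cycle, degrees

def clamp(value, lo, hi):
    return max(lo, min(hi, value))

def next_command(pose: HeadPose, current_pan: float, current_tilt: float,
                 limits: StereoheadLimits):
    """Compute the next pan/tilt command so the stereohead follows the
    operator's head, with range clamping and a simple rate limit."""
    target_pan = clamp(pose.yaw, *limits.pan_range)
    target_tilt = clamp(pose.pitch, *limits.tilt_range)
    pan = current_pan + clamp(target_pan - current_pan,
                              -limits.max_step, limits.max_step)
    tilt = current_tilt + clamp(target_tilt - current_tilt,
                                -limits.max_step, limits.max_step)
    return pan, tilt

# Example control cycle: the operator looks 20 degrees left and 10 degrees up.
pan, tilt = next_command(HeadPose(yaw=20.0, pitch=10.0),
                         current_pan=0.0, current_tilt=0.0,
                         limits=StereoheadLimits())
print(pan, tilt)  # 5.0 5.0 after one rate-limited step toward the target
```

A real implementation would run this at the camera frame rate and add filtering of the tracker signal; the sketch only illustrates the mapping from head motion to head-mounted camera motion described above.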