Integrating Gyroscopes into Ubiquitous Tracking Environments

It is widely recognized that inertial sensors, in particular gyroscopes, can reduce the latency and improve the accuracy of orientation tracking when their measurements are fused with data from other sensors. In our previous work, we introduced the concepts of spatial relationship graphs and spatial relationship patterns to formally model multi-sensor tracking setups and to derive valid applications of well-known algorithms that infer new spatial relationships for tracking and calibration. In this work, we extend the approach with additional spatial relationship patterns that transform incremental rotations and that cover gyroscope alignment and fusion. We evaluate the usefulness of the resulting tracking configurations in two scenarios, covering both inside-out and outside-in tracking.
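
The paper itself does not include source code; as a rough illustration of the kind of gyroscope fusion such patterns describe, the sketch below integrates incremental rotations from a gyroscope at a high rate and corrects the drifting estimate with slower absolute orientation measurements, e.g. from an optical inside-out or outside-in tracker. It uses a simple complementary filter rather than the authors' pattern machinery; all names, the quaternion convention ([w, x, y, z]), and the gain value are illustrative assumptions.

```python
import numpy as np

def quat_mul(a, b):
    """Hamilton product of two quaternions [w, x, y, z]."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def gyro_increment(omega, dt):
    """Incremental rotation quaternion from body-frame rate omega (rad/s) over dt (s)."""
    angle = np.linalg.norm(omega) * dt
    if angle < 1e-12:
        return np.array([1.0, 0.0, 0.0, 0.0])
    axis = omega / np.linalg.norm(omega)
    return np.concatenate(([np.cos(angle / 2)], np.sin(angle / 2) * axis))

def slerp(q0, q1, t):
    """Spherical linear interpolation between unit quaternions."""
    dot = np.dot(q0, q1)
    if dot < 0.0:            # take the shorter arc
        q1, dot = -q1, -dot
    if dot > 0.9995:         # nearly identical: linear blend is numerically safer
        q = q0 + t * (q1 - q0)
        return q / np.linalg.norm(q)
    theta = np.arccos(dot)
    return (np.sin((1 - t) * theta) * q0 + np.sin(t * theta) * q1) / np.sin(theta)

class GyroFusion:
    """Complementary filter: high-rate gyro integration, low-rate absolute correction."""

    def __init__(self, alpha=0.05):
        self.q = np.array([1.0, 0.0, 0.0, 0.0])  # current orientation estimate
        self.alpha = alpha                        # correction gain, tuned per setup

    def on_gyro(self, omega, dt):
        # Apply the incremental rotation in the body frame (right multiplication).
        self.q = quat_mul(self.q, gyro_increment(omega, dt))
        self.q /= np.linalg.norm(self.q)

    def on_tracker(self, q_abs):
        # Pull the drifting gyro estimate toward the absolute measurement.
        self.q = slerp(self.q, q_abs / np.linalg.norm(q_abs), self.alpha)
```

A Kalman-filter-based fusion, as is common in such tracking setups, would additionally estimate gyroscope bias and measurement uncertainty; the complementary filter above is only meant to show how incremental and absolute rotations can be combined.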
