A pivotable head mounted camera system that is aligned by three-dimensional eye movements

The first proof of concept of an eye-movement-driven head camera system was recently presented. This device used voluntary and reflexive eye movements, registered by video-oculography and computed online, as signals to drive servo motors that aligned the camera with the user's gaze direction. With only two degrees of freedom, however, this camera motion device could not compensate for roll motions around the optical axis of the system. A new three-degree-of-freedom camera motion device that can reproduce the full range of possible eye movements has therefore been implemented. It allows a freely mobile user to aim the optical axis of the head-mounted camera system at the target(s) in the visual field at which he or she is looking, while the ocular reflexes minimize image shake by naturally counter-rolling the "gaze in space" of the camera during head and visual-scene movements as well as during locomotion. A camera guided in this way mimics the natural exploration of a visual scene and acquires video sequences from the perspective of a mobile user, while the oculomotor reflexes naturally stabilize the camera on target during head and target movements. Various documentation and teaching applications in health care, industry, and research are conceivable. This work presents the implementation of the new camera motion device and its integration into a head camera setup that includes the eye tracking device.
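The abstract describes an online pipeline in which three-dimensional eye orientation from video-oculography is mapped onto three servo axes (pan, tilt, and roll about the optical axis). The paper itself gives no implementation details here, so the following is only a minimal Python sketch of what such a gaze-to-servo loop could look like. The names EyeSample, ServoCommand, read_eye_sample, send_servo_command, the axis limits, and the update rate are all hypothetical placeholders, not the authors' actual API or hardware parameters.

```python
# Hypothetical sketch of a gaze-to-servo control loop for a 3-DOF head camera.
# All identifiers and numeric limits are illustrative assumptions.

from dataclasses import dataclass
import time


@dataclass
class EyeSample:
    """3D eye-in-head orientation from video-oculography (degrees)."""
    horizontal: float  # left/right gaze angle
    vertical: float    # up/down gaze angle
    torsion: float     # rotation about the line of sight


@dataclass
class ServoCommand:
    """Setpoints for the three camera axes (degrees)."""
    pan: float
    tilt: float
    roll: float


# Assumed mechanical limits of the 3-DOF camera gimbal (illustrative values).
PAN_LIMIT, TILT_LIMIT, ROLL_LIMIT = 40.0, 30.0, 20.0


def clamp(value: float, limit: float) -> float:
    """Restrict a setpoint to the assumed mechanical range of its axis."""
    return max(-limit, min(limit, value))


def eye_to_servo(sample: EyeSample) -> ServoCommand:
    """Map the measured 3D eye orientation one-to-one onto the camera axes,
    so the optical axis follows gaze and the roll axis reproduces ocular
    torsion (e.g. counter-roll during head tilt)."""
    return ServoCommand(
        pan=clamp(sample.horizontal, PAN_LIMIT),
        tilt=clamp(sample.vertical, TILT_LIMIT),
        roll=clamp(sample.torsion, ROLL_LIMIT),
    )


def control_loop(read_eye_sample, send_servo_command, rate_hz: float = 100.0):
    """Online loop: read the latest eye sample, convert it, command the servos.
    The two callbacks stand in for the eye tracker and the motor interface."""
    period = 1.0 / rate_hz
    while True:
        command = eye_to_servo(read_eye_sample())
        send_servo_command(command)
        time.sleep(period)
```

In a real device the mapping would additionally have to account for the gimbal's axis ordering and the offset between the eye and the camera; the one-to-one mapping above is only meant to show the structure of the online eye-tracker-to-servo pipeline described in the abstract.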
