My thoughts through a robot's eyes: An augmented-reality brain-machine interface

A brain-machine interface (BMI) uses neurophysiological signals recorded from the brain to control external devices such as robot arms or computer cursors. By combining augmented reality with a BMI, we show that a user's brain signals can control an agent robot and, through it, operate devices in the robot's environment. The user's thoughts become reality through the robot's eyes, extending the user's agency into real environments beyond the anatomical boundaries of the body.
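
The abstract does not specify the decoding pipeline. As a minimal, hypothetical sketch only, the following assumes a P300-style selection paradigm: candidate robot commands are flashed in the user's augmented-reality view, post-flash EEG epochs are averaged per command, and the command whose average response best matches a P300 template is forwarded to the robot. Every name here (COMMANDS, simulate_epoch, decode_command) and the simulated signal are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

# Hypothetical command set the agent robot could execute in its environment.
COMMANDS = ["move_forward", "turn_left", "turn_right", "toggle_switch"]

rng = np.random.default_rng(0)
T = np.linspace(0.0, 0.6, 150)                         # 600 ms post-flash epoch
TEMPLATE = np.exp(-((T - 0.3) ** 2) / (2 * 0.05**2))   # P300-like bump near 300 ms


def simulate_epoch(is_target: bool) -> np.ndarray:
    """One single-channel EEG epoch after a command icon flashes.

    Flashes of the attended (target) icon evoke a P300-like positive
    deflection; non-target flashes yield noise only.
    """
    noise = rng.normal(0.0, 1.0, T.size)
    return noise + (2.5 * TEMPLATE if is_target else 0.0)


def decode_command(attended: str, n_repetitions: int = 10) -> str:
    """Average epochs per command over repeated flashes, score each average
    against the P300 template, and return the best-matching command."""
    averages = np.zeros((len(COMMANDS), T.size))
    for _ in range(n_repetitions):
        for i, cmd in enumerate(COMMANDS):
            averages[i] += simulate_epoch(is_target=(cmd == attended))
    averages /= n_repetitions
    scores = averages @ TEMPLATE  # template-matching score per command
    return COMMANDS[int(np.argmax(scores))]


if __name__ == "__main__":
    # The user attends to "turn_left"; the decoded command would then be sent
    # to the agent robot (and, via the robot, to devices in its surroundings).
    print(decode_command("turn_left"))  # expected: turn_left
```

Template matching stands in here for the trained linear classifiers typically used in P300 spellers; substituting a learned discriminant would leave the flash-average-select loop unchanged.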
