Sensing and decoding of visual stimuli using commercial Brain Computer Interface technology

This paper presents experiments using Brain Computer Interface technology and artificial neural networks to identify simple images viewed by a human subject. Electroencephalography (EEG) data are collected with the commercially available Emotiv Epoc headset and Matlab software while subjects view images composed of a 2×2 grid of black and white squares. Artificial neural networks (ANNs) are used to map the EEG data to a pixel array representing the image the subject is viewing. ANNs emulate a biological brain as an array of interconnected nodes that can be trained to map an arbitrary set of inputs to a given set of outputs. In this way, the trained network associates EEG data with the particular image a subject was viewing, allowing it to classify new EEG image data recorded from other human subjects.
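
As an illustration of the mapping described above, the following MATLAB sketch trains a small feedforward ANN to map per-trial EEG feature vectors to the four pixels of a 2×2 black-and-white image. It is not the authors' pipeline: the trial count, the feature dimensions, the hidden-layer size, and the use of the toolbox functions feedforwardnet and train are all assumptions made for illustration.

% Minimal sketch (assumed setup, not the authors' code): a feedforward ANN
% mapping EEG feature vectors to a 2x2 black-and-white image.
nTrials   = 200;                       % hypothetical number of recorded trials
nFeatures = 14 * 64;                   % e.g. 14 Emotiv Epoc channels x 64 spectral features
X = rand(nFeatures, nTrials);          % placeholder EEG feature matrix (one column per trial)
T = double(rand(4, nTrials) > 0.5);    % target images: 2x2 pixels flattened to 4 binary outputs

net = feedforwardnet(20);              % single hidden layer of 20 interconnected nodes
net.divideParam.trainRatio = 0.70;     % split trials into training, validation and test sets
net.divideParam.valRatio   = 0.15;
net.divideParam.testRatio  = 0.15;
net = train(net, X, T);                % supervised training by backpropagation

y = net(X(:, 1));                      % network output for a single trial
pixels = reshape(y > 0.5, 2, 2)        % threshold outputs to recover a 2x2 black/white image

Representing the image as four independent binary outputs, one per pixel, lets a single network reconstruct any of the possible 2×2 patterns rather than treating each pattern as a separate class; this output encoding is one plausible reading of the pixel-array mapping described above.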