Convolutional and Recurrent Neural Networks for Physical Action Forecasting by Brain-Computer Interface

Recently, deep neural networks (DNNs) have been intensively investigated for the analysis of time sequences such as electroencephalography (EEG) signals, which can be measured by a brain-computer interface (BCI). This work is dedicated to the investigation of EEG data gathered by BCI and the classification of basic physical actions by various DNNs. Several hybrid DNNs were considered as combinations of convolutional neural networks (CNNs), fully connected networks (FCNs), and recurrent neural networks (RNNs) such as gated recurrent unit (GRU) and long short-term memory (LSTM) blocks, applied to classify physical actions (here, hand movements) collected in the grasp-and-lift (GAL) dataset. The results obtained allow us to conclude that some of these hybrid DNNs, even in small and simple combinations, can classify physical actions reliably, with resource requirements low enough to port such hybrid models to the Edge Computing level on devices with limited computational resources.
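As an illustration of the kind of hybrid architecture described above, a minimal CNN + LSTM model for windowed EEG might be sketched as follows (in PyTorch). The channel count, window length, and number of classes here are assumptions for illustration, not the authors' exact configuration:

```python
import torch
import torch.nn as nn

class HybridCNNLSTM(nn.Module):
    """Illustrative hybrid CNN + LSTM classifier for windowed EEG.
    Hypothetical shapes: 32 EEG channels, 6 event classes."""
    def __init__(self, n_channels=32, n_classes=6, hidden=64):
        super().__init__()
        # 1-D convolutions extract local temporal features from each window
        self.cnn = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
        )
        # LSTM models longer-range temporal dependencies over CNN features
        self.rnn = nn.LSTM(input_size=64, hidden_size=hidden, batch_first=True)
        # fully connected head maps the last hidden state to class scores
        self.fc = nn.Linear(hidden, n_classes)

    def forward(self, x):            # x: (batch, channels, time)
        f = self.cnn(x)              # (batch, 64, time // 4)
        f = f.transpose(1, 2)        # (batch, time // 4, 64) for the LSTM
        _, (h, _) = self.rnn(f)
        return self.fc(h[-1])        # (batch, n_classes) logits

model = HybridCNNLSTM()
logits = model(torch.randn(8, 32, 256))  # 8 windows of 256 samples
print(tuple(logits.shape))  # (8, 6)
```

Swapping the `nn.LSTM` for an `nn.GRU`, or dropping the recurrent block and enlarging the fully connected head, yields the other hybrid variants mentioned in the abstract.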