An Adaptive Convolutional Neural Network Framework for Multi-user Myoelectric Interfaces

Recently, electromyogram (EMG)-based user interfaces have been developed to control wearable rehabilitation robots such as arm prostheses. In these interfaces, accurately decoding the user's movement intention is essential for proper robot control. However, high inter-user variability in EMG signals hinders stable decoding performance across multiple users. In this context, we developed a user-independent decoding method based on convolutional neural networks (CNNs) for multi-user myoelectric interfaces. Specifically, we devise a user-adaptive CNN framework that decodes movement intentions from raw EMG signals. The Ninapro database was used in our experiments, and the experimental results show that the proposed method successfully decoded hand movement intentions. The effectiveness of the proposed method was also confirmed by experiments decoding movement intentions across different subjects.
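
To make the decoding setup concrete, the following is a minimal sketch (not the authors' exact architecture) of a 1D CNN that classifies fixed-length windows of raw multi-channel EMG into hand-movement classes, together with one common way to realize user adaptation by fine-tuning on a new user's calibration data. The framework is PyTorch, and the window length, channel count, class count, layer sizes, and the fine-tuning procedure are all illustrative assumptions rather than details taken from the paper.

```python
# Illustrative sketch only: the paper's actual architecture and adaptation
# procedure may differ. Channel count, window length, class count, and
# layer sizes are assumptions.
import torch
import torch.nn as nn


class EMGConvNet(nn.Module):
    def __init__(self, n_channels: int = 12, n_classes: int = 10):
        super().__init__()
        # Temporal convolutions applied directly to the raw EMG window.
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=11, padding=5),
            nn.BatchNorm1d(32),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=7, padding=3),
            nn.BatchNorm1d(64),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),  # collapse the time axis
        )
        self.classifier = nn.Linear(64, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, n_channels, window_length) raw EMG samples
        h = self.features(x).squeeze(-1)
        return self.classifier(h)


def adapt_to_new_user(model: EMGConvNet, calib_loader, epochs: int = 5,
                      lr: float = 1e-3) -> EMGConvNet:
    """Hypothetical user-adaptation step (an assumption, not necessarily the
    paper's method): freeze the convolutional features learned from many
    users and fine-tune only the classifier on a small calibration set
    recorded from the new user."""
    for p in model.features.parameters():
        p.requires_grad = False
    opt = torch.optim.Adam(model.classifier.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for x, y in calib_loader:
            opt.zero_grad()
            loss = loss_fn(model(x), y)
            loss.backward()
            opt.step()
    return model


if __name__ == "__main__":
    # Example: a batch of 8 windows, 12 EMG channels, 400 samples each.
    model = EMGConvNet(n_channels=12, n_classes=10)
    logits = model(torch.randn(8, 12, 400))
    print(logits.shape)  # torch.Size([8, 10])
```

In this sketch, pretraining on pooled multi-user data would provide the user-independent baseline, while `adapt_to_new_user` illustrates how a small amount of subject-specific data could refine the decoder for a new user.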