Identification of static and dynamic muscle activation patterns for intuitive human/computer interfaces

The goal of this pilot research was to create an intuitive human-computer interface that would allow a person to control a robotic arm using the electromyographic (EMG) signals generated by their own arm movements (muscle activations). EMG data contain sufficient information to accurately differentiate between movements based on the observed muscle activation strategy. A classification algorithm was designed that accurately predicted arm movements, determining whether the test subject's arm was moving up, down, left, or right, or closing a fist, and identifying which static base position the subject held when not moving. Building on these predictions, a successful EMG-driven interface was implemented for the robotic arm: it moved the robotic arm in the same direction as the test subject's arm, replicated a static position, and grasped a piece of Styrofoam. With further research and refinement, this library of kinesiological movements could be expanded to encompass the full spectrum of human arm movement.
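The abstract does not specify the classification method, but the pipeline it describes (EMG features mapped to discrete movement classes) can be illustrated with a minimal sketch. The example below is purely hypothetical: it assumes windowed root-mean-square (RMS) amplitude as the EMG feature and a nearest-centroid classifier, neither of which is confirmed by the text. The channel gains and labels are synthetic stand-ins for real electrode recordings.

```python
import numpy as np

def rms_features(emg, window=200):
    """RMS amplitude per channel over the most recent `window` samples.
    emg: (samples, channels) array of raw EMG voltages."""
    return np.sqrt(np.mean(emg[-window:] ** 2, axis=0))

class NearestCentroidEMG:
    """Tiny nearest-centroid classifier over RMS feature vectors
    (an illustrative stand-in, not the paper's actual algorithm)."""
    def fit(self, feats, labels):
        self.labels_ = sorted(set(labels))
        self.centroids_ = np.array(
            [np.mean([f for f, l in zip(feats, labels) if l == c], axis=0)
             for c in self.labels_])
        return self

    def predict(self, feat):
        dists = np.linalg.norm(self.centroids_ - feat, axis=1)
        return self.labels_[int(np.argmin(dists))]

# Synthetic two-channel EMG: each "movement" activates channels with
# different gains (e.g. "up" drives channel 0, "fist" drives both).
rng = np.random.default_rng(0)
def fake_emg(gains):
    return rng.normal(size=(400, 2)) * np.asarray(gains)

train_specs = [((3.0, 0.5), "up"), ((0.5, 3.0), "down"), ((3.0, 3.0), "fist")]
feats, labels = zip(*[(rms_features(fake_emg(g)), lbl)
                      for g, lbl in train_specs for _ in range(20)])
clf = NearestCentroidEMG().fit(list(feats), list(labels))
print(clf.predict(rms_features(fake_emg((3.0, 0.5)))))
```

In a real interface, the predicted label would then be translated into a command for the robotic arm (move up, move down, close gripper, hold position), with a "rest" class covering the static base positions.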