Electromyography and inertial sensor-based gesture detection and control
With the advent of new technologies, the need to interact with machines has increased. To improve this interaction, it is necessary to create simple, intuitive interfaces that are accessible to everyone. Gestures are naturally used in the real world to interact with objects or transmit information. The ability to capture, recognize and interpret gestures may thus enable humans to communicate with machines and provide more natural forms of human-computer interaction (HCI).

The main objective of this dissertation was to develop an algorithm capable of gesture pattern recognition using information from electromyography (EMG) and inertial measurement unit (IMU) sensors. The EMG provides information on muscular activity, while the IMU allows the measurement of movement, for example the velocity and orientation of body segments. Using the Myo gesture control armband, a device recently introduced to the market that incorporates eight EMG channels and one IMU, a comparison between the custom-built system and a commercial system was possible. The custom-built system acquired data from four EMG channels of a BITalino device placed on the forearm, combined with a smartwatch that provided the IMU data.

Twelve gestures were correctly identified, including hand contraction and extension, wrist extension and flexion, and snapping the fingers, among others. For pattern recognition, the twelve gestures were recorded with both systems, and the data were pre-processed to enhance muscle activation and segment the signal. The best set of features from both sensors was then selected, and finally different classification techniques were applied. In the end, all gestures were successfully recognized, with no difference in performance between the two systems. The use of information from both sensors proved to be essential. The ability to recognize gestures enables the creation of new interaction techniques.
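The dissertation itself does not reproduce its implementation here, but the pipeline it describes (enhancing muscle activation, segmenting the signal, and extracting time-domain features for classification) can be sketched as follows. This is a minimal illustration under assumed conventions; all function names, window sizes, and thresholds are hypothetical, and the feature set (MAV, RMS, waveform length, zero crossings) is a classic choice for EMG, not necessarily the exact set selected in the thesis.

```python
import math

def envelope(emg, window=5):
    # Rectify the raw EMG and smooth it with a trailing moving average
    # to obtain a muscle-activation envelope (window size is illustrative).
    rect = [abs(v) for v in emg]
    return [sum(rect[max(0, i - window + 1):i + 1]) / min(window, i + 1)
            for i in range(len(rect))]

def segment(env, threshold):
    # Return (start, end) index pairs where the envelope stays above
    # the activation threshold, i.e. candidate gesture windows.
    segments, start = [], None
    for i, v in enumerate(env):
        if v >= threshold and start is None:
            start = i
        elif v < threshold and start is not None:
            segments.append((start, i))
            start = None
    if start is not None:
        segments.append((start, len(env)))
    return segments

def features(window):
    # Classic time-domain EMG features: mean absolute value (MAV),
    # root mean square (RMS), waveform length (WL), zero crossings (ZC).
    n = len(window)
    mav = sum(abs(v) for v in window) / n
    rms = math.sqrt(sum(v * v for v in window) / n)
    wl = sum(abs(window[i] - window[i - 1]) for i in range(1, n))
    zc = sum(1 for i in range(1, n) if window[i - 1] * window[i] < 0)
    return [mav, rms, wl, zc]
```

In a full system, feature vectors computed per segment (concatenated with IMU-derived features such as orientation and angular velocity statistics) would be fed to a classifier; the thesis compares several classification techniques but the specific choice is not stated in this abstract.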