A Myographic-based HCI Solution Proposal for Upper Limb Amputees

Abstract Interaction plays a fundamental role in bridging humans and computers. However, people with disabilities are often prevented from using computers through ordinary means due to physical or intellectual impairments. The human-computer interaction (HCI) research field has therefore been developing solutions to improve technological accessibility for impaired people, equipping computers and similar devices with the means to accommodate different disabilities and thereby helping to reduce digital exclusion. Within this scope, this paper presents an interaction solution for upper limb amputees, supported by a myographic gesture-control device named Myo. This emergent wearable technology consists of a muscle-sensitive bracelet that transmits myographic and inertial data, which can be converted into actions for interaction purposes (e.g. clicking or moving a mouse cursor). Although designed as a gesture-control armband, Myo can also be worn on the leg, as preliminary tests with users ascertained; both data types (myographic and inertial) continue to be transmitted and remain available for conversion into gestures. A general architecture, a use case diagram and the specification of the two main functional modules are presented. These will guide the future implementation of the proposed Myo-based HCI solution, which is intended to be a solid contribution to the interaction between upper limb amputees and computers.
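The conversion of inertial and myographic data into pointer actions described above can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the names (`Pose`, `map_gyro_to_cursor`, `pose_to_action`) and the mappings are assumptions, and no real Myo SDK API is used.

```python
# Hypothetical sketch of the two conversion paths the solution relies on:
# inertial data -> cursor movement, and recognized muscle poses -> clicks.
from enum import Enum


class Pose(Enum):
    """Illustrative subset of muscle poses a myographic armband might report."""
    REST = 0
    FIST = 1
    FINGERS_SPREAD = 2


def map_gyro_to_cursor(gyro_x: float, gyro_y: float, sensitivity: float = 0.5):
    """Convert angular velocity readings (deg/s) into cursor deltas (pixels).

    The vertical axis is inverted so that tilting the limb upward
    moves the cursor up on screen.
    """
    return (round(gyro_x * sensitivity), round(-gyro_y * sensitivity))


def pose_to_action(pose: Pose) -> str:
    """Map a recognized pose to a pointer action; unknown poses do nothing."""
    return {
        Pose.FIST: "left_click",
        Pose.FINGERS_SPREAD: "right_click",
    }.get(pose, "none")
```

In a full system, the deltas and actions would be forwarded to the operating system's pointer interface; keeping the mapping in pure functions like these makes the gesture-to-action layer testable in isolation.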
