New Interfaces and Approaches to Machine Learning When Classifying Gestures within Music

Interactive music uses wearable sensors (gestural interfaces, or GIs) and biometric datasets to reinvent traditional human-computer interaction and enhance music composition. In recent years, machine learning (ML) has become important to the artform because it can process the complex biometric datasets produced by GIs and predict musical actions (termed performance gestures), allowing musicians to create novel interactions with digital media. Wekinator is a popular ML application amongst artists that lets users train models through demonstration; it is built on the Waikato Environment for Knowledge Analysis (WEKA) framework for supervised predictive modelling. Previous research has used biometric data from GIs to train specific ML models, but it neither identifies an optimum model choice for music nor compares model performance. Wekinator offers several ML models. To address this gap, we used Wekinator with the Myo armband GI and studied three performance gestures for piano practice. We trained all models available in Wekinator and investigated their accuracy, how gesture representation affects model accuracy, and whether optimisation can arise. Results show that neural networks are the strongest continuous classifiers, that mapping behaviour differs amongst continuous models, that optimisation can occur, and that gesture representation affects model mapping behaviour in disparate ways, with consequences for music practice.
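Wekinator itself is a GUI application trained by demonstration, but the model-comparison workflow the abstract describes can be sketched outside it. The following is a minimal illustrative sketch, assuming scikit-learn and synthetic stand-in data (the Myo armband streams 8 EMG channels; real feature extraction and the exact Wekinator model set are not reproduced here):

```python
# Illustrative sketch (not Wekinator): comparing several classifier types
# on synthetic 8-channel EMG-style features for 3 gesture classes.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.neural_network import MLPClassifier

# Synthetic stand-in for Myo EMG features: 8 channels, 3 gesture classes.
X, y = make_classification(n_samples=600, n_features=8, n_informative=6,
                           n_classes=3, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

# Candidate models, loosely mirroring families Wekinator exposes.
models = {
    "k-NN": KNeighborsClassifier(n_neighbors=5),
    "Decision tree": DecisionTreeClassifier(random_state=0),
    "Neural network": MLPClassifier(hidden_layer_sizes=(32,),
                                    max_iter=1000, random_state=0),
}

# Train each model and report held-out classification accuracy.
for name, model in models.items():
    model.fit(X_train, y_train)
    print(f"{name}: accuracy = {model.score(X_test, y_test):.2f}")
```

On real data, the same loop would run over EMG features extracted from armband recordings of each performance gesture, with accuracy compared per model as in the study.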
