Demonstrating Interactive Machine Learning Tools for Rapid Prototyping of Gestural Instruments in the Browser

These demonstrations allow visitors to prototype gestural, interactive musical instruments in the browser. Different browser-based synthesisers can be controlled by either a Leap Motion sensor or a Myo armband. Visitors can use an interactive machine learning toolkit to quickly and iteratively explore different interaction possibilities. The demonstrations show how interactive, browser-based machine learning tools can be used to rapidly prototype gestural controllers for audio, and showcase RapidLib, a browser-based machine learning library developed through the RAPID-MIX project.

1. INTRODUCTION

Music is an ideal use case for machine learning, for two primary reasons:

1. Music software often has a complex interface, with multiple parameters that a performer might want to control in order to expressively modulate a sound.

2. Musicians often have precise, embodied knowledge about gestural interaction and control of sound, developed through years of instrumental practice.

Many of the ways in which we interface with audio software on computers fail to facilitate effective control of multiple parameters, and do not exploit the rich gestural language of musicians. The keyboard and mouse, for instance, offer limited interaction possibilities: trying to adjust arrays of knobs and sliders on a software synthesiser GUI using mouse clicks alone is awkward and cumbersome. Musicians therefore often use specialist interfaces for musical control of computers. Commercially available MIDI keyboards and control surfaces allow for a wider range of interaction possibilities than keyboard and mouse, and more adventurous modes of interaction are explored through conferences such as NIME (New Interfaces for Musical Expression) [3]. From Max Mathews's Radio-Baton to Michel Waisvisz's Hands [4], computer musicians have long explored novel ways of interacting with computers.

Machine learning can connect the embodied knowledge of musicians to the multiple parameters of a software instrument. Interactive Machine Learning (IML), in particular, allows intuitive control of complex systems and puts the ability to refine those systems in the hands of the end user [2]. Musicians can therefore bring their embodied knowledge to gestural controllers, refining their interactions with a software instrument over multiple iterations. Complex and expressive musical interactions can be created without typing a line of code.

2. SOFTWARE

The demonstrations run in CodeCircle using the MaxiLib and RapidLib libraries.

CodeCircle, developed by Fiala, Yee-King and Grierson [1], is an online editor that enables real-time collaborative coding (see Figure 1). It is geared towards creative contexts and works with HTML, CSS, JavaScript and several third-party media libraries.

Figure 1. Leap Motion running in CodeCircle, tracking one hand.

Synthesis is handled by Mick Grierson's Maximilian (http://maximilian.strangeloop.co.uk/), which runs in CodeCircle as a JavaScript library, MaxiLib. MaxiLib and Maximilian are open-source libraries for audio synthesis and signal processing. They provide standard waveforms, sample playback, resonant filters, delay lines, FFTs, granular synthesis and low-level feature extraction.

Interactive machine learning is handled by RapidLib, a machine learning library running in CodeCircle. RapidLib was developed through the Real-time Adaptive Prototyping for Industrial Design of Multimodal Interactive and eXpressive technologies (RAPID-MIX) project.

CodeCircle running with RapidLib and MaxiLib provides an ideal environment for collaboratively developing instruments in the browser. No specialist software needs to be installed: the environment runs on any computer with internet access and a modern browser. The interface is the same on Linux, OS X and Windows, completed projects can be exported and run offline, and a project can be accessed and forked by multiple users, with documents updated reactively. Two short sketches below illustrate how these libraries are used.
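As a brief illustration of the synthesis side, the following is a minimal MaxiLib patch: a sawtooth oscillator through a resonant low-pass filter. It assumes the maximJs namespace exposed by the CodeCircle build of Maximilian; exact names may differ between versions.

    // Minimal MaxiLib synthesiser sketch (assumed maximJs namespace).
    var audio = new maximJs.maxiAudio();
    var osc = new maximJs.maxiOsc();
    var filter = new maximJs.maxiFilter();

    audio.init();

    // play() is called once per audio sample; assign one output sample per call.
    audio.play = function () {
      var saw = osc.sawn(110);                  // band-limited 110 Hz sawtooth
      this.output = filter.lores(saw, 800, 2);  // low-pass: 800 Hz cutoff, resonance 2
    };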
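RapidLib exposes an interactive train-and-run workflow. The toy sketch below pairs two gesture feature vectors with target synthesis parameters and trains a regression model that interpolates between them. The RapidLib() module call and Regression constructor follow the pattern of the RAPID-MIX JavaScript examples, but should be checked against the library version in use.

    // Toy RapidLib regression: gesture features in, synth parameters out.
    // Constructor names are assumptions based on the RAPID-MIX JS examples.
    var rapidLib = RapidLib();
    var model = new rapidLib.Regression();

    // Each training example pairs gesture features (input) with desired
    // synthesiser parameters (output): here, frequency and filter resonance.
    var trainingSet = [
      { input: [0.1, 0.2], output: [110, 1] },
      { input: [0.8, 0.9], output: [880, 5] }
    ];

    model.train(trainingSet);

    // After training, run() returns interpolated parameters for a new gesture.
    var params = model.run([0.5, 0.5]);  // [frequency, resonance]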
3. HARDWARE

Two pieces of hardware are used in the demonstration: a Myo armband and a Leap Motion controller. These allow for a range of gestural interactions with a computer.

The Myo armband, developed by Thalmic Labs, is worn on the forearm and communicates with the computer via Bluetooth. It provides eight channels of EMG (electromyographic) data, along with accelerometer, gyroscope and magnetometer readings.

The Leap Motion is a USB device for gesture tracking, which uses an array of infrared cameras to track hands in 3D. It can report tracking information for two hands and ten fingers, giving real-time updates at a rate of around 200 frames per second.

These devices represent the state of the art in affordable consumer products that offer real-time, gestural interaction with computers, and can be used as musical interfaces (Figure 2).

4. INTERACTION

Visitors can choose from several browser-based synthesisers and samplers running in CodeCircle, and are free to explore their chosen synthesiser, finding different sounds according to their taste. Using either the Myo armband or the Leap Motion, the visitor then creates a set of gestures associated with desired or interesting sounds. The system is trained on these pairs of gestures and sounds, and within a few seconds the visitor can experiment with an interactive space. The system can then be refined by training it with further combinations of sound and gesture; a sketch of this train-and-run loop appears at the end of this paper.

A selection of demos is accessible through http://doc.gold.ac.uk/eavi/rapidmixapi.com/index.php/examples/javascript/

5. TECHNICAL REQUIREMENTS

The demo can be scaled up or down as required, either to save space or to be accessible to more visitors. The following allows for two simultaneous demonstrations:

1× mains power socket
1× table (approx. 1.2 m × 0.6 m)

We provide, as a minimum:
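To make the interaction loop of Section 4 concrete, the sketch below reads hand position from the Leap Motion using the public leap.js API and feeds it through a trained RapidLib model into MaxiLib synthesis parameters. It reuses model, audio, osc and filter from the earlier sketches; the feature scaling and parameter mapping are illustrative assumptions rather than the exact demo code.

    // Glue between Leap Motion, a trained RapidLib model and a MaxiLib synth.
    // 'model', 'audio', 'osc' and 'filter' are defined in the sketches above.
    var frequency = 440;
    var resonance = 1;

    Leap.loop(function (frame) {
      if (frame.hands.length > 0) {
        var palm = frame.hands[0].palmPosition;  // [x, y, z] in millimetres
        // Normalise palm x and y into roughly 0..1 before running the model.
        var input = [(palm[0] + 200) / 400, palm[1] / 400];
        var params = model.run(input);           // trained as in the sketch above
        frequency = params[0];
        resonance = params[1];
      }
    });

    // Per-sample synthesis callback, now driven by the model's output.
    audio.play = function () {
      this.output = filter.lores(osc.sawn(frequency), 800, resonance);
    };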