Experiments with Adaptable Multimodal User Interfaces for Mobile Devices

Our daily living environment contains an increasing number of technical systems that require human control. Modern mobile devices offer the functionality to act as generic control devices for these technical systems. Beyond classical user interactions, they provide components for processing new modes of human-machine interaction, such as speech and touch input. In this article, we describe a software framework that supports multimodal input on Android-based mobile devices. We apply this framework in a case study to develop adaptable, multimodal user interfaces for two different robot control applications. Furthermore, we present results from evaluating these interfaces with a group of test participants.