Visualization in Multimodal User Interfaces of Mobile Applications

Advanced user interfaces are a crucial factor in the success of mobile information systems employed by different users on a variety of devices. They should provide state-of-the-art visualization and interaction techniques tailored to specific tasks, while at the same time allowing flexible deployment of these components on a multitude of (mobile) hardware platforms. Visualizations in particular have to adapt to the platform's capabilities in order to remain not only effective but also adequate. Focus & Context techniques, like lens techniques, are one way to make efficient use of displays with low resolution and small size. Here, a good tradeoff between complexity and response time is important. Moreover, complex inputs are not feasible on most mobile devices; simple, straightforward, context-driven interaction options must be presented to the user. The above can be achieved by integrating a task model, user (role) and resource models, as well as multimodal interaction facilities such as speech recognition, into the user interface component of mobile information systems. The talk will report on research on the above aspects within the M6C project.
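
To illustrate why Focus & Context techniques map well onto small, low-resolution displays, the following is a minimal sketch of a graphical fisheye lens in the style of Sarkar and Brown; it is not taken from the M6C project, and all names and parameters (fisheye, focus, extent, d) are illustrative assumptions. The distortion is a constant-time function per point, which is one reason the complexity/response-time tradeoff mentioned above can be kept favorable on mobile hardware.

    # Illustrative sketch of a 1-D graphical fisheye distortion (Sarkar/Brown style).
    # Not project code; names and defaults are assumptions for demonstration only.

    def fisheye(x: float, focus: float, extent: float, d: float = 3.0) -> float:
        """Map coordinate x to a distorted coordinate that magnifies the
        region around `focus`. `extent` is the half-width of the visible
        area along this axis; `d` is the distortion factor (d = 0: identity)."""
        # Pick the display boundary on the same side of the focus as x.
        boundary = focus + extent if x >= focus else focus - extent
        if boundary == focus:
            return x
        # Normalised distance from the focus, in [0, 1] inside the view.
        norm = (x - focus) / (boundary - focus)
        # Distortion function g(a) = (d + 1) * a / (d * a + 1):
        # expands space near the focus, compresses it towards the boundary.
        g = (d + 1) * norm / (d * norm + 1)
        return focus + g * (boundary - focus)

    # Applied independently to x and y of each glyph or graph node, e.g.:
    #   screen_x = fisheye(node_x, focus_x, half_width)
    #   screen_y = fisheye(node_y, focus_y, half_height)

Because each point is transformed independently in O(1), the lens can be re-evaluated on every interaction step (e.g., as the focus follows a finger or stylus) without straining the limited processing power of a mobile device.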