Context-aware and mobile interactive systems: the future of user interface plasticity

Mobility and the integration of systems into the physical environment are key challenges for computer science. In particular, user interfaces (UIs) must accommodate variations in the context of use while preserving human-centered properties. We call this capacity UI plasticity. This tutorial begins by reviewing ideas from the last decade concerning the plasticity of user interfaces. From this starting point, it develops key ideas and perspectives for the near future. These are illustrated with a demo of a tool for prototyping plastic widgets and UIs. First, in the near future there will be a need to elaborate a theory of adaptation that predicts and explains the difficulties users encounter when adaptation occurs. Second, to go beyond simplistic UI adaptation, there will be a need to bring together advances in several research areas, including HCI (to support multimodality), Software Engineering (in particular Model-Driven Engineering, Aspect-Oriented Programming, and components and services, to cover both design-time and run-time adaptation), and Artificial Intelligence (to support situated information and planning). Indeed, in most current research the user's task model is taken as given and serves as the starting point for generating UIs on the fly; in addition, the functional core is considered stable rather than open to the opportunistic discovery of services. In the coming years, we will need to confront challenges that go beyond HCI: (1) the incompleteness and uncertainty of the system's perception of both the context of use and the appropriateness of the adapted UI; (2) the combinatorial explosion that arises when composing a UI to sustain emergent user goals. Finally, we will need to develop environments (or studios) for UI plasticity that integrate partial advances, make the theory operational, and ease the task of designers and developers.
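To make the notion of a plastic widget more concrete, the following is a minimal, hypothetical sketch (not the tutorial's actual tool) of a widget that keeps its functional role stable while swapping its presentation at run time according to the context of use. All names (ContextOfUse, Presentation, PlasticValueWidget) and the suitability rules are illustrative assumptions, not an established API.

```typescript
// Hypothetical sketch: a "plastic" widget that re-selects its presentation
// when the context of use changes, while keeping the same abstract task.

// Context of use: platform resources and user situation (simplified).
interface ContextOfUse {
  screenWidthPx: number;                              // 0 when no display is available
  inputModalities: Array<"touch" | "pointer" | "voice">;
  ambientNoiseDb: number;                             // rules out audio in noisy settings
}

// A candidate presentation for the same abstract task ("set a value").
interface Presentation {
  name: string;
  isSuitable(ctx: ContextOfUse): boolean;
  render(label: string): string;                      // stands in for real toolkit calls
}

const slider: Presentation = {
  name: "slider",
  isSuitable: ctx => ctx.screenWidthPx >= 400 && ctx.inputModalities.includes("touch"),
  render: label => `<slider label="${label}"/>`,
};

const stepper: Presentation = {
  name: "stepper",
  isSuitable: ctx => ctx.screenWidthPx > 0 && ctx.screenWidthPx < 400,
  render: label => `<stepper label="${label}"/>`,
};

const voicePrompt: Presentation = {
  name: "voice",
  isSuitable: ctx => ctx.inputModalities.includes("voice") && ctx.ambientNoiseDb < 60,
  render: label => `say("Please state the ${label}")`,
};

// The plastic widget swaps presentations at run time when the context of use
// changes (e.g. migrating from a wall display to a phone, or to eyes-free use).
class PlasticValueWidget {
  constructor(private label: string,
              private candidates: Presentation[] = [slider, stepper, voicePrompt]) {}

  adaptTo(ctx: ContextOfUse): string {
    const chosen = this.candidates.find(p => p.isSuitable(ctx));
    // Degraded but usable fallback rather than a silent failure.
    return chosen ? chosen.render(this.label) : `<textfield label="${this.label}"/>`;
  }
}

// Usage: the same widget, three contexts of use, three presentations.
const w = new PlasticValueWidget("temperature");
console.log(w.adaptTo({ screenWidthPx: 1920, inputModalities: ["touch"], ambientNoiseDb: 40 }));
console.log(w.adaptTo({ screenWidthPx: 320,  inputModalities: ["touch"], ambientNoiseDb: 40 }));
console.log(w.adaptTo({ screenWidthPx: 0,    inputModalities: ["voice"], ambientNoiseDb: 35 }));
```

The design choice illustrated here, keeping an abstract task constant while selecting among alternative concrete presentations, is one simple way to preserve human-centered properties across contexts; richer approaches discussed in the tutorial additionally rely on task models, Model-Driven Engineering transformations, and run-time service discovery.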