As mobile terminals, be they cell phones, PDAs, or gaming consoles, evolve, their range of features grows ever wider, and the ways in which interactive content is presented become increasingly diverse. Yet the means of interacting with applications, the user-interface capabilities, remain restricted to a small screen, audio input and output, and occasionally a stylus pointing instrument. At the same time, the environment in which mobile communication occurs becomes ever more crowded with devices offering a wide range of interface modalities. We designed a scheme in which, when a mobile user comes into physical range of devices hosting such modalities (screens, audio systems, etc.), these ambient interface devices are bound into an overall multimodal user interface; a sketch of this binding scheme is given below. In effect, whenever suitable devices are available, the user interface can be tailored to the needs of the user as well as the requirements of the application. While the principle is straightforward, it raises a number of associated problems.
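To make the binding scheme concrete, the following is a minimal sketch of the discovery-and-binding loop it implies: as modality-hosting devices enter and leave physical range, they are attached to or detached from a composite multimodal interface. All type and method names here (ModalityDevice, MultimodalUI, AmbientUiBinder) are hypothetical illustrations, not the actual API of the system described.

```java
import java.util.HashSet;
import java.util.Set;

// Hypothetical sketch: binds ambient interface devices into one multimodal UI
// as they come into and out of physical range. Names are illustrative only.
final class AmbientUiBinder {
    private final Set<ModalityDevice> bound = new HashSet<>();
    private final MultimodalUI ui;

    AmbientUiBinder(MultimodalUI ui) { this.ui = ui; }

    /** Called whenever discovery reports a device entering physical range. */
    void onDeviceInRange(ModalityDevice device) {
        // Bind only devices offering a modality the application can use.
        if (ui.requiredModalities().stream().anyMatch(device::supports)
                && bound.add(device)) {
            ui.attach(device); // re-tailor the interface around the new modality
        }
    }

    /** Called when a device leaves range; the UI falls back to the terminal. */
    void onDeviceOutOfRange(ModalityDevice device) {
        if (bound.remove(device)) {
            ui.detach(device);
        }
    }
}

/** A device hosting one or more interface modalities (screen, audio, ...). */
interface ModalityDevice {
    boolean supports(String modality);
}

/** The application-side composite user interface. */
interface MultimodalUI {
    Set<String> requiredModalities();
    void attach(ModalityDevice d);
    void detach(ModalityDevice d);
}
```

The sketch assumes some underlying discovery middleware invokes the two callbacks; how devices are discovered, and how conflicts between simultaneously available modalities are resolved, are exactly the kinds of associated problems noted above.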