Interaction with Adaptive and Ubiquitous User Interfaces

Current user interfaces such as public displays, smartphones, and tablets strive to provide a constant flow of information. Although they can all be regarded as a first step towards Mark Weiser's vision of ubiquitous computing, they still fall short of the ubiquity and omnipresence Weiser envisioned. To achieve this goal, such devices must be able to blend in with their environment and be constantly available. Since this scenario is technically challenging, researchers have simulated this behavior using projector-camera systems. This technology opens the possibility of investigating how users interact with always-available and adaptive information interfaces, which are both important properties of a Companion-technology. Such a Companion system will be able to provide users with information how, where, and when it is desired. In this chapter we describe in detail the design and development of three projector-camera systems (UbiBeam, SpiderLight, and SmarTVision). Based on insights from prior user studies, we implemented these systems as a mobile, a nomadic, and a home-deployed projector-camera system, each of which can transform any plain surface into an interactive user interface. Finally, we discuss future possibilities for combining Companion-systems with projector-camera systems to enable fully adaptive and ubiquitous user interfaces.
