Interaction Tasks and Controls for Public Display Applications

Public displays are becoming increasingly interactive, and a broad range of interaction mechanisms can now be used to support many forms of interaction. However, the lack of interaction abstractions forces each developer to create ad hoc approaches for handling interaction, preventing users from building consistent expectations about how to interact across different display systems. There is a clear analogy with the early days of the graphical user interface, when a similar problem was addressed by the emergence of high-level interaction abstractions that provided consistent interaction experiences to users and shielded developers from low-level details. This work takes a first step in the same direction by uncovering interaction abstractions that may lead to the emergence of interaction controls for public display applications. We identify a new set of interaction tasks focused on the specificities of public displays; we characterise interaction controls that may enable those interaction tasks to be integrated into applications; and we create a mapping between the high-level abstractions provided by the interaction tasks and the concrete interaction mechanisms that displays can implement. Together, these contributions constitute a step towards programming toolkits with widgets that developers could incorporate into their public display applications.
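The mapping from high-level interaction tasks to concrete mechanisms is easiest to picture as a widget-style programming abstraction. The sketch below, in TypeScript, is purely illustrative: it is not the PuReWidgets API or any existing toolkit, and every name in it (InteractionMechanism, SelectionControl, TouchMechanism) is hypothetical. It shows one way a single "selection" task could be exposed to application developers while the underlying input channel (touch, SMS, QR code, Bluetooth device naming, and so on) remains interchangeable.

```typescript
// Hypothetical sketch of a high-level "selection" interaction control that is
// decoupled from the concrete input mechanism a given display supports.
// All names are invented for illustration and do not correspond to any
// existing public display toolkit API.

// A concrete mechanism (touch, SMS, QR code, Bluetooth naming, ...) only has
// to report which user picked which option; the control hides the details.
interface InteractionMechanism {
  readonly name: string;
  // Register a callback invoked whenever this mechanism detects a selection.
  onSelection(handler: (userId: string, optionId: string) => void): void;
}

// The high-level control a display application would embed, analogous to a
// GUI widget: the developer works with options and results, not protocols.
class SelectionControl {
  private votes = new Map<string, string>(); // userId -> optionId

  constructor(
    private readonly options: string[],
    mechanisms: InteractionMechanism[],
  ) {
    // Map every available low-level mechanism onto the same abstract task.
    for (const m of mechanisms) {
      m.onSelection((userId, optionId) => {
        if (this.options.includes(optionId)) {
          this.votes.set(userId, optionId); // last selection per user wins
        }
      });
    }
  }

  // Aggregate result the application can render on the public display.
  tally(): Record<string, number> {
    const counts: Record<string, number> = {};
    for (const o of this.options) counts[o] = 0;
    for (const optionId of this.votes.values()) {
      counts[optionId] += 1;
    }
    return counts;
  }
}

// Example: a touch-based mechanism; an SMS or QR-code mechanism would expose
// the same interface, so the application code stays unchanged across displays.
class TouchMechanism implements InteractionMechanism {
  readonly name = "touch";
  private handlers: Array<(userId: string, optionId: string) => void> = [];
  onSelection(handler: (userId: string, optionId: string) => void): void {
    this.handlers.push(handler);
  }
  // Called by the display's touch layer when someone taps an option.
  simulateTap(userId: string, optionId: string): void {
    this.handlers.forEach((h) => h(userId, optionId));
  }
}

const touch = new TouchMechanism();
const poll = new SelectionControl(["coffee", "tea"], [touch]);
touch.simulateTap("anon-42", "coffee");
console.log(poll.tally()); // { coffee: 1, tea: 0 }
```

The design choice mirrored here is that the application only ever talks to the control; swapping the display's interaction mechanisms requires no change to application code, which is the kind of consistency the abstract argues users and developers currently lack.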
