Shoot & copy: phonecam-based information transfer from public displays onto mobile phones

Large public displays have become pervasive in our everyday lives, but so far they remain mostly information screens without any means of interaction. Users tend to forget what they saw soon after leaving such a display. In this paper, we present a new interaction technique for transferring information from a public display onto a personal mobile phone using its built-in camera. Instead of having to rely on their memory, users simply take a picture of the information of interest. Rather than just storing the image, our system retrieves the actual data represented on the screen, such as a stock quote, a news text, or a piece of music. The Shoot & Copy technique does not require visual codes that interfere with the displayed content or reduce screen real estate. Our prototype allows users to capture an arbitrary region of a standard desktop screen containing icons that represent pieces of data. The captured image is analyzed, and a reference to the corresponding data is sent back to the mobile phone. Once users have time to view the information in more detail, our system lets them retrieve the actual data from this reference. We present our prototype and the image-processing methods it uses, as well as an evaluation of our interaction technique illustrating its potential uses and applications.
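To make the described pipeline concrete, the sketch below illustrates one possible server-side flow: locate the phone photo within a screenshot of the display, then map the icons inside the recovered region to compact data references that are returned to the phone. This is only an illustrative approximation, not the paper's actual image-processing algorithm; it uses OpenCV ORB feature matching with a RANSAC homography, and all names and registry entries (locate_captured_region, ICON_REGISTRY, references_in_region, the ref:// URIs) are hypothetical.

```python
# Illustrative sketch only: OpenCV ORB matching stands in for the paper's
# own image analysis; registry contents and function names are hypothetical.

import cv2
import numpy as np

# Hypothetical mapping from icon bounding boxes (x, y, w, h) on the public
# display to references of the data those icons represent.
ICON_REGISTRY = {
    (40, 40, 96, 96):  "ref://news/article-17",
    (160, 40, 96, 96): "ref://stocks/ACME",
    (280, 40, 96, 96): "ref://music/track-042",
}

def locate_captured_region(screenshot_gray, photo_gray):
    """Estimate where the phone photo lies within the display's screenshot
    by matching ORB features and fitting a homography with RANSAC."""
    orb = cv2.ORB_create(nfeatures=1000)
    kp_photo, des_photo = orb.detectAndCompute(photo_gray, None)
    kp_screen, des_screen = orb.detectAndCompute(screenshot_gray, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_photo, des_screen),
                     key=lambda m: m.distance)[:80]
    src = np.float32([kp_photo[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp_screen[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    # Project the photo's corners into screen coordinates.
    h, w = photo_gray.shape[:2]
    corners = np.float32([[0, 0], [w, 0], [w, h], [0, h]]).reshape(-1, 1, 2)
    quad = cv2.perspectiveTransform(corners, H).reshape(-1, 2)
    x_min, y_min = quad.min(axis=0)
    x_max, y_max = quad.max(axis=0)
    return x_min, y_min, x_max, y_max  # captured region in screen coordinates

def references_in_region(region):
    """Return references for all icons fully contained in the captured region;
    only these compact references, not the data itself, go back to the phone."""
    x_min, y_min, x_max, y_max = region
    return [ref for (ix, iy, iw, ih), ref in ICON_REGISTRY.items()
            if x_min <= ix and y_min <= iy
            and ix + iw <= x_max and iy + ih <= y_max]
```

In such a design, only a small reference travels over the initial link between phone and display host (e.g., Bluetooth or Wi-Fi); the actual data is fetched later, when the user chooses to follow the reference.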
