Pacer: fine-grained interactive paper via camera-touch hybrid gestures on a cell phone

PACER is a gesture-based interactive paper system that supports fine-grained manipulation of paper document content through the touch screen of a camera phone. Using the phone's camera, PACER links a paper document to its digital version based on visual features. It adopts camera-based phone motion detection for embodied gestures (e.g., marquees, underlines, and lassos), with which users can flexibly select and interact with document details (e.g., individual words, symbols, and pixels). Touch input is incorporated to facilitate target selection at fine granularity and to address limitations of the embodied interaction, such as hand jitter and a low input sampling rate. This hybrid interaction is coupled with other techniques, such as semi-real-time document tracking and loose physical-digital document registration, to offer a gesture-based command system. We demonstrate the use of PACER in various scenarios, including work-related reading, map navigation, and playing music scores. A preliminary user study of the design produced encouraging feedback and suggested future research toward a better understanding of embodied vs. touch interaction and one- vs. two-handed interaction.
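The abstract's camera-based linking step rests on matching local visual features between the captured frame and the digital document. As a minimal sketch of that idea, the following assumed simplification matches feature descriptors with a Lowe-style ratio test (the criterion used by the SIFT family of local features); the function name, the toy 2-D descriptors, and the 0.8 ratio threshold are illustrative assumptions, not PACER's actual pipeline.

```python
import math

def match_descriptors(query, database, ratio=0.8):
    """Match each query descriptor to its nearest database descriptor,
    keeping the match only if the nearest neighbor is clearly closer
    than the second nearest (Lowe's ratio test).

    Illustrative sketch: real systems use high-dimensional descriptors
    (e.g., 128-D SIFT) and approximate nearest-neighbor search.
    """
    matches = []
    for qi, q in enumerate(query):
        # Distances from this query descriptor to every database descriptor.
        dists = sorted(
            (math.dist(q, d), di) for di, d in enumerate(database)
        )
        best, second = dists[0], dists[1]
        # Accept only unambiguous matches: best distance must beat
        # the runner-up by the ratio margin.
        if best[0] < ratio * second[0]:
            matches.append((qi, best[1]))
    return matches

# Toy example: one distinctive descriptor matches, one ambiguous
# descriptor (equidistant from several candidates) is rejected.
db = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
qs = [(0.5, 0.2), (5.0, 5.0)]
print(match_descriptors(qs, db))  # → [(0, 0)]
```

The ratio test is what makes such matching robust on cluttered camera frames: ambiguous descriptors (those with several near-equidistant candidates) are discarded rather than guessed, leaving a small set of confident correspondences from which the physical-to-digital registration can be estimated.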
