The Fusion of an Ultrasonic and Spatially Aware System in a Mobile-Interaction Device

For the past four decades, computer pundits and prognosticators have predicted the imminent arrival of the paperless office. Yet physical paper documents still play a significant role thanks to their ease of use, superior readability, and availability. Their drawbacks are that they are hard to modify and retrieve, offer limited space, and are environmentally unfriendly. Augmenting paper documents with digital information from mobile devices extends the two-dimensional space of physical paper. Various camera-based recognition and detection devices have been proposed for this purpose, but these systems still have limitations. This paper presents a novel, low-cost, spatially aware mobile system called Ultrasonic PhoneLens, which combines two-dimensional dynamic image presentation with ultrasonic positioning. The system consists of two ultrasonic sensors, an Arduino microcontroller board, and an Android mobile device. Based on the location of the mobile device over a physical sheet of paper, Ultrasonic PhoneLens retrieves pre-saved digital information from a mobile database for the object (such as a text passage, a paragraph, or an image) at that position in the document. An empirical study was conducted to evaluate system performance. The results indicate that our system performs better than the Wiimote PhoneLens system in tasks such as browsing multivalent documents and sharing digital information.
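To make the positioning pipeline concrete, the sketch below illustrates how an Arduino could sample two ultrasonic range sensors and stream their distance readings to the mobile device, which would then map the pair of distances to a position over the paper and look up the content registered there. This is a minimal illustration only: the paper does not specify sensor models, wiring, or the serial protocol, so the HC-SR04-style trigger/echo pins, the speed-of-sound constant, and the comma-separated output format are all assumptions.

// Minimal Arduino sketch (assumed HC-SR04-style sensors; pin assignments,
// timing constants, and output format are illustrative, not from the paper).
const int TRIG_PINS[2] = {2, 4};   // trigger pins for sensors A and B (assumed wiring)
const int ECHO_PINS[2] = {3, 5};   // echo pins for sensors A and B (assumed wiring)

// Measure one sensor: pulse the trigger, time the echo, convert to centimeters.
float readDistanceCm(int trigPin, int echoPin) {
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH);
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);

  // Echo pulse width in microseconds; sound travels ~0.0343 cm/us, and the
  // pulse covers the round trip, so divide by two. A 30 ms timeout bounds
  // the measurable range to roughly five meters.
  unsigned long duration = pulseIn(echoPin, HIGH, 30000UL);
  if (duration == 0) return -1.0f;             // no echo within timeout
  return (duration * 0.0343f) / 2.0f;
}

void setup() {
  Serial.begin(9600);
  for (int i = 0; i < 2; i++) {
    pinMode(TRIG_PINS[i], OUTPUT);
    pinMode(ECHO_PINS[i], INPUT);
  }
}

void loop() {
  // Report both distances; the mobile application is assumed to translate
  // the (dA, dB) pair into a location over the page and retrieve the
  // digital content stored for the object at that location.
  float dA = readDistanceCm(TRIG_PINS[0], ECHO_PINS[0]);
  float dB = readDistanceCm(TRIG_PINS[1], ECHO_PINS[1]);
  Serial.print(dA); Serial.print(','); Serial.println(dB);
  delay(100);
}

On the receiving side, the Android application would read these comma-separated readings over the serial link and use them as a key into its local database of annotated document regions; the exact lookup scheme is part of the system described in the paper and is not reproduced here.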
