Innovative wearable interfaces: an exploratory analysis of paper-based interfaces with camera-glasses device unit

New ubiquitous interaction methods are changing people's lives and facilitating their tasks in everyday life and in the workplace, enabling people to access their personal data as well as public resources at any time and in any place. We identified two solutions that enable ubiquitous interaction and overcome the limits imposed by the desktop mode: nomadism and mobility. Based on these two solutions, we have proposed three interfaces (Zhou et al. in HCI International 2011: Human–Computer Interaction. Interaction Techniques and Environments, Springer, Berlin, pp 500–509, 2011): the in-environment interface (IEI), the environment dependent interface (EDI), and the environment independent interface (EII). In this paper, we first give an overview of IEI, EDI, and EII, before setting IEI aside to focus on EDI and EII, their background, and their distinct characteristics. We also propose a continuum from the physical paper-based interface to the digital projected interface in relation to EDI and EII. Then, to validate the EDI and EII concepts, we design and implement MobilePaperAccess, a wearable camera-glasses system with a paper-based interface and original input techniques allowing mobile interaction. Furthermore, we discuss the evaluation of the MobilePaperAccess system, comparing the two interfaces (EDI and EII) and three input techniques (finger input, mask input, and page input) to test the system's feasibility and usability. Both quantitative and qualitative results are reported and discussed. Finally, we outline prospects and future work for improving the current approaches.

[1] Kenneth P. Fishkin, et al. A taxonomy for and analysis of tangible interfaces, 2004, Personal and Ubiquitous Computing.

[2] Ivan Poupyrev, et al. MotionBeam: a metaphor for character interaction with handheld projectors, 2011, CHI.

[3] D. A. Grant. The Latin square principle in the design and analysis of psychological experiments, 1948, Psychological Bulletin.

[4] Saran Chari, et al. Wearable Computing and Augmented Reality, 1999.

[5] Andries van Dam, et al. Post-WIMP user interfaces, 1997, CACM.

[6] Lars Kulik, et al. Gesture recognition using RFID technology, 2012, Personal and Ubiquitous Computing.

[7] Hiroshi Ishii, et al. Bricks: laying the foundations for graspable user interfaces, 1995, CHI '95.

[8] R. Likert. A Technique for the Measurement of Attitudes, 1932, Archives of Psychology.

[9] Jovan Popovic, et al. Real-time hand-tracking with a color glove, 2009, SIGGRAPH '09.

[10] Eva Hornecker, et al. Using ARToolKit Markers to Build Tangible Prototypes and Simulate Other Technologies, 2005, INTERACT.

[11] Hiroshi Ishii, et al. Tangible bits: beyond pixels, 2008, TEI.

[12] Christophe Kolski, et al. RFID-driven situation awareness on TangiSense, a table interacting with tangible objects, 2011, Personal and Ubiquitous Computing.

[13] Matt Jones, et al. Pico-ing into the future of mobile projection and contexts, 2011, Personal and Ubiquitous Computing.

[14] Michael Rohs, et al. The smart phone: a ubiquitous input device, 2006, IEEE Pervasive Computing.

[15] Hiroshi Ishii, et al. Tangible bits: towards seamless interfaces between people, bits and atoms, 1997, CHI.

[16] Chris Harrison, et al. Minput: enabling interaction on small mobile devices with high-precision, low-cost, multipoint optical tracking, 2010, CHI.

[17] Jun Rekimoto, et al. Brainy hand: an ear-worn hand gesture interaction device, 2009, CHI Extended Abstracts.

[18] Bertrand David, et al. IMERA: Experimentation Platform for Computer Augmented Environment for Mobile Actors, 2007, Third IEEE International Conference on Wireless and Mobile Computing, Networking and Communications (WiMob 2007).

[19] Mark Weiser. The Computer for the 21st Century, 1999, Computer.

[20] Yun Zhou, et al. Innovative User Interfaces for Wearable Computers in Real Augmented Environment, 2011, HCI.

[21] Roel Vertegaal, et al. DisplayObjects: prototyping functional physical interfaces on 3d styrofoam, paper or cardboard models, 2010, TEI '10.

[22] Gerard Jounghyun Kim, et al. Usability of one-handed interaction methods for handheld projection-based augmented reality, 2013, Personal and Ubiquitous Computing.

[23] Hans-Werner Gellersen, et al. Personal Projectors for Pervasive Computing, 2012, IEEE Pervasive Computing.

[24] J. Rouillard. Contextual QR Codes, 2008, The Third International Multi-Conference on Computing in the Global Information Technology (ICCGI 2008).

[25] William Buxton. A three-state model of graphical input, 1990, INTERACT.

[26] Karl D. D. Willis. A pre-history of handheld projector-based interaction, 2011, Personal and Ubiquitous Computing.

[27] Desney S. Tan, et al. Emerging Input Technologies for Always-Available Mobile Interaction, 2011, Found. Trends Hum. Comput. Interact.

[28] Hao Tang, et al. FACT: fine-grained cross-media interaction with documents via a portable hybrid paper-laptop interface, 2010, ACM Multimedia.

[29] Nikolaus F. Troje, et al. Paper windows: interaction techniques for digital paper, 2005, CHI.

[30] Gary R. Bradski. Real time face and object tracking as a component of a perceptual user interface, 1998, Proceedings Fourth IEEE Workshop on Applications of Computer Vision (WACV '98).

[31] Kent Lyons, et al. Twiddler typing: one-handed chording text entry for mobile phones, 2004, CHI.

[32] Jun Rekimoto, et al. CyberCode: designing augmented reality environments with visual tags, 2000, DARE '00.

[33] Chris Harrison, et al. OmniTouch: wearable multitouch interaction everywhere, 2011, UIST.

[34] Gary Bradski. Computer Vision Face Tracking For Use in a Perceptual User Interface, 1998.

[35] Desney S. Tan, et al. Skinput: appropriating the body as an input surface, 2010, CHI.

[36] Yun Zhou, et al. Mobile User Interfaces and their Utilization in a Smart City, 2011.

[37] Kent Lyons, et al. Experimental Evaluations of the Twiddler One-Handed Chording Mobile Keyboard, 2006, Hum. Comput. Interact.

[38] I. Scott MacKenzie. Fitts' Law as a Research and Design Tool in Human-Computer Interaction, 1992, Hum. Comput. Interact.

[39] Noa M. Rensing, et al. Eyeglass-based systems for wearable computing, 1997, Digest of Papers, First International Symposium on Wearable Computers.

[40] J. Rouillard, et al. Contextual QR Codes, 2008, The Third International Multi-Conference on Computing in the Global Information Technology (ICCGI 2008).

[41] Yonggang Ha, et al. Optical assessment of head-mounted displays in visual space, 2002, Applied Optics.

[42] Pattie Maes, et al. WUW - Wear Ur World: a wearable gestural interface, 2009, CHI Extended Abstracts.

[43] Patrick Baudisch, et al. Disappearing mobile devices, 2009, UIST '09.

[44] Pattie Maes, et al. Quickies: intelligent sticky notes, 2008.