Surround-see: enabling peripheral vision on smartphones during active use

Mobile devices are endowed with significant sensing capabilities. However, their ability to 'see' their surroundings during active use is limited. We present Surround-See, a self-contained smartphone equipped with an omni-directional camera that enables peripheral vision around the device to augment daily mobile tasks. Surround-See provides mobile devices with a field of view collinear to the device screen. This capability facilitates novel mobile tasks such as pointing at objects in the environment to interact with content, operating the mobile device from a physical distance, and letting the device detect user activity even when the user is not holding it. We describe Surround-See's architecture and demonstrate applications that exploit peripheral 'seeing' capabilities during active use of a mobile device. Users confirmed the value of embedding peripheral vision capabilities in mobile devices and offered insights into novel usage methods.
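A system like Surround-See must first unwrap the donut-shaped image produced by an omni-directional (catadioptric) camera into a rectangular panorama before standard vision processing can run on it. The sketch below is a minimal, illustrative polar-to-Cartesian unwrapping with nearest-neighbour sampling; the function name, parameters, and the choice of pure NumPy are assumptions for illustration, not the paper's actual implementation.

```python
import numpy as np

def unwrap_omni(img, center, r_min, r_max, out_width=360):
    """Unwrap a donut-shaped omnidirectional image into a panorama.

    Each panorama column corresponds to one viewing angle around the
    mirror; each row corresponds to one radius in the source image.
    Sampling is nearest-neighbour for simplicity.
    """
    cx, cy = center
    # One column per angular sample, one row per radial sample.
    thetas = np.linspace(0.0, 2.0 * np.pi, out_width, endpoint=False)
    radii = np.arange(r_min, r_max)
    # Source coordinates for every panorama pixel (broadcasted grid).
    xs = np.rint(cx + radii[:, None] * np.cos(thetas)[None, :]).astype(int)
    ys = np.rint(cy + radii[:, None] * np.sin(thetas)[None, :]).astype(int)
    # Clamp to the image bounds, then gather with fancy indexing.
    xs = np.clip(xs, 0, img.shape[1] - 1)
    ys = np.clip(ys, 0, img.shape[0] - 1)
    return img[ys, xs]
```

In practice the unwrapped panorama would feed the downstream recognizers (hand, object, and activity detection); bilinear interpolation and a calibrated mirror model would replace the nearest-neighbour lookup in a production pipeline.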
