NanoStylus: Enhancing Input on Ultra-Small Displays with a Finger-Mounted Stylus

Due to their limited input area, ultra-small devices such as smartwatches are even more prone to occlusion and the fat-finger problem than their larger counterparts, such as smartphones, tablets, and tabletop displays. We present NanoStylus -- a finger-mounted fine-tip stylus that enables fast and accurate pointing on a smartwatch with almost no occlusion. The NanoStylus is built from the circuitry of an active capacitive stylus and mounted within a custom 3D-printed, thimble-shaped housing. A sensor strip is mounted on each side of the device to enable additional gestures. A user study shows that NanoStylus reduces error rate by 80% compared to traditional touch interaction, and by 45% compared to a traditional stylus. This high-precision pointing capability, coupled with the implemented gesture sensing, enables us to explore a rich set of interactive applications on the smartwatch form factor.
