On-Skin Interaction Using Body Landmarks

Human skin is a promising input surface for computing devices, but it differs fundamentally from existing touch-sensitive devices. The authors propose using skin landmarks, which offer unique tactile and visual cues, to enhance body-based user interfaces.
