Use Your Head! Exploring Interaction Modalities for Hat Technologies

As the landscape of wearable technologies proliferates, we find more devices situated on our heads. However, many challenges hinder their widespread adoption - from awkward, bulky form factors (today's AR and VR goggles) to socially stigmatized designs (Google Glass) and the lack of a well-developed head-based interaction design language. In this paper, we explore a socially acceptable, large, head-worn interactive wearable - a hat. We report the results of a gesture elicitation study with 17 participants, extract a taxonomy of gestures, and define a set of design concerns for interactive hats. Through this lens, we detail the design and fabrication of three hat prototypes capable of sensing touch, head movements, and gestures, and incorporating several types of ambient displays. Finally, we report an evaluation of our hat prototypes and insights that inform the design of future hat technologies.
