Unmasked

Due to the COVID-19 pandemic, wearing a mask that covers the mouth is recommended in public spaces to prevent the spread of the virus. Masks hinder our ability to express ourselves: behind a mask, facial expressions are hard to read and lips impossible to see. We present Unmasked, an expressive interface that uses lip tracking to enhance communication while wearing a mask. Unmasked explores three sensing methods (accelerometers, LED markers tracked by a camera, and streaming video) to capture the wearer's mouth movements and display their facial expressions on an LCD mounted on the front of the mask. By restoring some of the expression a mask conceals, the prototype makes social distancing less disruptive and more bearable, metaphorically closing some of the distance between us.
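The abstract does not give implementation details, so the following is only a minimal sketch of how the streaming-video variant might work. It assumes a camera facing the wearer and an off-the-shelf MediaPipe FaceMesh model; the specific lip landmark indices and the on-screen rendering (standing in for the mask's front-mounted LCD) are illustrative assumptions, not the authors' method.

```python
# Sketch of a streaming-video lip tracker; not the Unmasked authors' code.
import cv2
import mediapipe as mp

# MediaPipe FaceMesh indices for the mouth corners and the inner-lip
# midpoints (an assumed, commonly used choice of landmarks).
LEFT_CORNER, RIGHT_CORNER, UPPER_LIP, LOWER_LIP = 61, 291, 13, 14

face_mesh = mp.solutions.face_mesh.FaceMesh(max_num_faces=1)
cap = cv2.VideoCapture(0)

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # FaceMesh expects RGB input; OpenCV captures BGR.
    results = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_face_landmarks:
        lm = results.multi_face_landmarks[0].landmark
        # Mouth openness: inner-lip gap normalized by mouth width, so the
        # measure is roughly invariant to camera distance.
        width = abs(lm[RIGHT_CORNER].x - lm[LEFT_CORNER].x)
        gap = abs(lm[LOWER_LIP].y - lm[UPPER_LIP].y)
        openness = gap / width if width > 0 else 0.0
        # Stand-in for rendering to the mask's front LCD: draw the value
        # on a preview window instead.
        cv2.putText(frame, f"mouth openness: {openness:.2f}", (10, 30),
                    cv2.FONT_HERSHEY_SIMPLEX, 1.0, (0, 255, 0), 2)
    cv2.imshow("Unmasked sketch", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```

The openness value is one plausible signal a display driver could map onto an animated mouth; the accelerometer and LED-marker variants mentioned above would feed the same rendering stage from different sensors.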
