eat2pic: Food-tech Design as a Healthy Nudge with Smart Chopsticks and Canvas

In this paper, we introduce eat2pic, a food-tech design that acts as a healthy nudge, encouraging people to slow down their eating pace and adopt healthier eating habits. The eat2pic system consists of a sensor-equipped chopstick (one of a pair) and context-aware digital canvases. It achieves (1) automatic tracking of what the user eats and how fast each mouthful is consumed, and (2) real-time visual feedback that indicates good or bad eating behavior. The key concept behind eat2pic is to extend the relationship between humans and paintings, establishing a closed loop in which daily eating behavior is reflected in pictures on nearby canvases and feedback from those canvases guides users toward healthier diets. The eat2pic system thus provides a novel experience in which users come to see eating as a playful task of coloring landscape pictures, rather than simply taking nutrition into their bodies.
