Data@Hand: Fostering Visual Exploration of Personal Data on Smartphones Leveraging Speech and Touch Interaction

Young-Ho Kim, Bongshin Lee, Arjun Srinivasan∗, and Eun Kyoung Choe

∗Arjun Srinivasan conducted this work while with Georgia Institute of Technology.

Most mobile health apps employ data visualization to help people view their health and activity data, but these apps provide limited support for visual data exploration. Furthermore, despite its huge potential benefits, mobile visualization research in the personal data context is sparse. This work aims to empower people to easily navigate and compare their personal health data on smartphones by enabling flexible time manipulation with speech. We designed and developed Data@Hand, a mobile app that leverages the synergy of two complementary modalities: speech and touch. Through an exploratory study with 13 long-term Fitbit users, we examined how multimodal interaction helps participants explore their own health data. Participants successfully adopted multimodal interaction (i.e., speech and touch) for convenient and fluid data exploration. Based on the quantitative and qualitative findings, we discuss design implications and opportunities with multimodal interaction for better supporting visual data exploration on mobile devices.

CHI '21, May 8–13, 2021, Yokohama, Japan. https://doi.org/10.1145/3411764.3445421
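The abstract's central mechanism is "flexible time manipulation with speech": a spoken phrase must be resolved into a concrete date range before the app can update its charts. The TypeScript sketch below illustrates that kind of mapping under stated assumptions; it is not the authors' implementation, and the names weekRange and parseTimeExpression are hypothetical helpers covering only a few phrasings.

```typescript
// Illustrative sketch: resolving a spoken relative time expression
// (e.g., "last week") into a concrete date range, the step a
// speech-plus-touch exploration app needs before redrawing a chart.

type DateRange = { start: Date; end: Date };

// Monday-to-Sunday range of the week containing `ref`,
// shifted by `offsetWeeks` (e.g., -1 for "last week").
function weekRange(ref: Date, offsetWeeks: number): DateRange {
  const start = new Date(ref);
  const daysSinceMonday = (start.getDay() + 6) % 7; // getDay(): Sunday = 0
  start.setDate(start.getDate() - daysSinceMonday + offsetWeeks * 7);
  start.setHours(0, 0, 0, 0);
  const end = new Date(start);
  end.setDate(end.getDate() + 6);
  return { start, end };
}

// Resolve a handful of relative expressions against a reference date.
// A real speech front end would cover far more phrasings and ambiguity.
function parseTimeExpression(utterance: string, ref = new Date()): DateRange | null {
  const text = utterance.trim().toLowerCase();
  if (text === "this week") return weekRange(ref, 0);
  if (text === "last week") return weekRange(ref, -1);
  const monthMatch = text.match(/^(last|this) month$/);
  if (monthMatch) {
    const shift = monthMatch[1] === "last" ? -1 : 0;
    const start = new Date(ref.getFullYear(), ref.getMonth() + shift, 1);
    // Day 0 of the next month is the last day of the target month.
    const end = new Date(ref.getFullYear(), ref.getMonth() + shift + 1, 0);
    return { start, end };
  }
  return null; // unrecognized: fall back to touch input or clarification
}

// Example: "last week" spoken on Wednesday, May 12, 2021
const range = parseTimeExpression("last week", new Date(2021, 4, 12));
console.log(range?.start.toDateString(), "→", range?.end.toDateString());
// Mon May 03 2021 → Sun May 09 2021
```

Per the abstract, such speech-specified ranges complement touch: speech jumps directly to or compares time periods, while touch supports direct manipulation of the visualization itself.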
