Learning Design Semantics for Mobile Apps

Recently, researchers have developed black-box approaches to mine design and interaction data from mobile apps. Although the data captured during this interaction mining is descriptive, it does not expose the design semantics of UIs: what elements on the screen mean and how they are used. This paper introduces an automatic approach for generating semantic annotations for mobile app UIs. Through an iterative open coding of 73k UI elements and 720 screens, we contribute a lexical database of 25 types of UI components, 197 text button concepts, and 135 icon classes shared across apps. We use this labeled data to learn code-based patterns to detect UI components and to train a convolutional neural network that distinguishes between icon classes with 94% accuracy. To demonstrate the efficacy of our approach at scale, we compute semantic annotations for the 72k unique UIs in the Rico dataset, assigning labels for 78% of the total visible, non-redundant elements.
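The icon-classification step described above can be illustrated with a minimal convolutional pipeline. The sketch below is not the authors' trained model (the abstract does not specify the architecture); it is a framework-free NumPy toy showing the standard shape of such a classifier: convolution, ReLU, global average pooling, and a softmax over the 135 icon classes. All kernel and weight values here are random placeholders.

```python
import numpy as np

def conv2d(x, kernels):
    """Valid 2-D convolution: x (H, W), kernels (K, kh, kw) -> (K, H-kh+1, W-kw+1)."""
    K, kh, kw = kernels.shape
    H, W = x.shape
    out = np.zeros((K, H - kh + 1, W - kw + 1))
    for k in range(K):
        for i in range(H - kh + 1):
            for j in range(W - kw + 1):
                out[k, i, j] = np.sum(x[i:i + kh, j:j + kw] * kernels[k])
    return out

def classify_icon(icon, kernels, weights):
    """Toy CNN forward pass: conv -> ReLU -> global average pool -> softmax."""
    feats = np.maximum(conv2d(icon, kernels), 0.0)  # conv + ReLU
    pooled = feats.mean(axis=(1, 2))                # global average pooling
    logits = pooled @ weights                       # linear classifier head
    exp = np.exp(logits - logits.max())             # numerically stable softmax
    return exp / exp.sum()

rng = np.random.default_rng(0)
icon = rng.random((32, 32))               # a grayscale icon crop (size assumed)
kernels = rng.standard_normal((8, 3, 3))  # 8 random 3x3 filters (placeholder)
weights = rng.standard_normal((8, 135))   # 135 icon classes, as in the paper
probs = classify_icon(icon, kernels, weights)
```

In practice the filters and weights would be learned from the labeled icon data, and a deeper network is what achieves the reported 94% accuracy; this sketch only shows how a pixel crop maps to a probability distribution over icon classes.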
