On Adopting Linters to Deal with Performance Concerns in Android Apps

With millions of applications (apps) distributed through mobile markets, engaging and retaining end-users challenges Android developers to deliver a near-perfect user experience. As mobile apps run on resource-limited devices, performance is a critical criterion for the quality of experience. Developers are therefore expected to pay close attention to limiting performance bad practices. On the one hand, many studies have already identified such performance bad practices and shown that they can heavily impact app performance; hence, many static analysers, a.k.a. linters, have been proposed to detect and fix them. On the other hand, other studies have shown that Android developers tend to deal with performance reactively and rarely rely on linters to detect and fix performance bad practices. In this paper, we therefore report on a qualitative study that investigates this gap between the research and development communities. In particular, we interviewed 14 experienced Android developers to identify the perceived benefits and constraints of using linters to identify performance bad practices in Android apps. Our observations can have a direct impact on both developers and the research community. Specifically, we describe why and how developers leverage static source code analysers to improve the performance of their apps. On top of that, we bring to light important challenges faced by developers when it comes to adopting static analysis for performance purposes.
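To make the kind of performance bad practice discussed above concrete, the following minimal sketch (our own illustration, not an example taken from the paper) shows a classic case that a linter such as Android Lint flags with its DrawAllocation check: allocating objects inside onDraw(), which runs on every frame and puts pressure on the garbage collector. The fixed version hoists the allocation out of the drawing path.

```java
import android.content.Context;
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.Paint;
import android.view.View;

public class ChartView extends View {

    // Bad practice: allocating inside onDraw() creates garbage on every
    // frame; Android Lint reports this as a DrawAllocation warning.
    //
    // @Override
    // protected void onDraw(Canvas canvas) {
    //     Paint paint = new Paint();  // re-allocated on every frame
    //     paint.setColor(Color.BLUE);
    //     canvas.drawCircle(getWidth() / 2f, getHeight() / 2f, 48f, paint);
    // }

    // Fix: allocate once and reuse the same object across frames.
    private final Paint paint = new Paint();

    public ChartView(Context context) {
        super(context);
        paint.setColor(Color.BLUE);
    }

    @Override
    protected void onDraw(Canvas canvas) {
        canvas.drawCircle(getWidth() / 2f, getHeight() / 2f, 48f, paint);
    }
}
```

In a standard Gradle-based Android project, running `./gradlew lint` produces a report listing such warnings; whether and how developers act on these reports is precisely the adoption question this study investigates.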
