An examination of the run-time performance of GUI creation frameworks

The graphical user interface (GUI) is an important component of many software systems. Past surveys indicate that developing a GUI is a significant undertaking and that a GUI's source code often comprises a substantial portion of a program's overall code base. GUI creation frameworks for popular object-oriented programming languages enable the rapid construction of both simple and complex GUIs. In this paper, we examine the run-time performance of two GUI creation frameworks for the Java programming language, Swing and Thinlet. Using a simple model of a Java GUI, we formally define the difficulty of a GUI manipulation event. After implementing a case study application, we conducted experiments to measure the event handling latency of GUI manipulation events at varying difficulty levels. We also measured the CPU and memory consumption of the candidate application during the selected GUI manipulation events. Our experimental results indicate that Thinlet often outperformed Swing in both event handling latency and memory consumption. However, Swing appears better suited, in terms of event handling latency and CPU consumption, to GUIs that require manipulations at high difficulty levels.
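The abstract above measures event handling latency for GUI manipulation events of varying difficulty. As one hedged illustration of how such a measurement could be taken in Java, the sketch below times how long a task posted to the Swing Event Dispatch Thread takes to complete, scaling the work on the EDT as a stand-in for manipulation "difficulty". The class name `LatencyProbe`, the helper `measureDispatchLatency`, and the busy-loop workload are all hypothetical illustrations, not the paper's actual instrumentation.

```java
import javax.swing.SwingUtilities;

public class LatencyProbe {

    // Hypothetical helper: time from posting a task on the Event
    // Dispatch Thread (EDT) until it has finished, in nanoseconds.
    static long measureDispatchLatency(Runnable task) throws Exception {
        long start = System.nanoTime();
        SwingUtilities.invokeAndWait(task); // blocks until the EDT has run the task
        return System.nanoTime() - start;
    }

    public static void main(String[] args) throws Exception {
        // Simulate GUI manipulations of increasing "difficulty" by doing
        // proportionally more work on the EDT (a stand-in for widget updates).
        for (int difficulty : new int[] {1, 10, 100}) {
            final int d = difficulty;
            long ns = measureDispatchLatency(() -> {
                long sum = 0;
                for (long i = 0; i < d * 100_000L; i++) {
                    sum += i;
                }
            });
            System.out.println("difficulty=" + d + " latencyMicros=" + ns / 1_000);
        }
    }
}
```

A real harness would instead drive concrete widget manipulations (e.g. adding rows to a table or repainting a panel) and would repeat each measurement many times to smooth out JIT compilation and garbage collection noise, as discussed in the Java performance literature the paper cites.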
