Impact of interruption style on end-user debugging

Although researchers have begun to explicitly support end-user programmers' debugging by providing information to help them find bugs, little research has addressed the proper mechanism for alerting users to this information. The choice of alerting mechanism can be important because, as previous research has shown, different interruption styles have different potential advantages and disadvantages. To explore the impacts of interruptions in the end-user debugging domain, this paper describes an empirical comparison of two interruption styles that have been used to alert end-user programmers to debugging information. Our results show that negotiated-style interruptions were superior to immediate-style interruptions on several measures of importance to end-user debugging, and further suggest that a reason for this superiority may be that immediate-style interruptions encourage different debugging strategies.