Usefulness and usability of heuristic walkthroughs for evaluating domain-specific developer tools in industry: Evidence from four field simulations

[1] Luca Berardinelli et al. Visualizing Multi-dimensional State Spaces Using Selective Abstraction, 2020, 46th Euromicro Conference on Software Engineering and Advanced Applications (SEAA).

[2] Federico Botella et al. Programmer eXperience: A Set of Heuristics for Programming Environments, 2020, HCI.

[3] N. Menold et al. Rating-Scale Labeling in Online Surveys: An Experimental Comparison of Verbal and Numeric Rating Scales with Respect to Measurement Quality and Respondents’ Cognitive Processes, 2020.

[4] Thomas Weber et al. Usability of Development Tools: A CASE-Study, 2019, ACM/IEEE 22nd International Conference on Model Driven Engineering Languages and Systems Companion (MODELS-C).

[5] Brian Fitzgerald et al. The ABC of Software Engineering Research, 2018, ACM Trans. Softw. Eng. Methodol.

[6] Miguel Goulão et al. Usability driven DSL development with USE-ME, 2018, Comput. Lang. Syst. Struct.

[7] Michael Greenstein et al. Not all Anchors Weigh Equally: Differences Between Numeral and Verbal Anchors, 2017, Experimental Psychology.

[8] Bernhard Hoisl et al. Reusable and generic design decisions for developing UML-based domain-specific languages, 2017, Inf. Softw. Technol.

[9] Avelino Francisco Zorzo et al. Usability Evaluation of Domain-Specific Languages: A Systematic Literature Review, 2017, HCI.

[10] Thomas D. LaToza et al. Programmers Are Users Too: Human-Centered Methods for Improving Programming Tools, 2016, Computer.

[11] Fraser McKay et al. Heuristic Evaluation for Novice Programming Systems, 2016.

[12] Ruth Breu et al. An integrated tool environment for experimentation in domain specific language engineering, 2016, EASE.

[13] Marjan Mernik et al. Domain-Specific Languages: A Systematic Mapping Study, 2017, SOFSEM.

[14] Eelco Visser et al. Evaluating and comparing language workbenches: Existing results and benchmarks for the future, 2015, Comput. Lang. Syst. Struct.

[15] Semih Bilgen et al. A framework for qualitative assessment of domain-specific languages, 2015, Software & Systems Modeling.

[16] Uirá Kulesza et al. Assessing and Evolving a Domain Specific Language for Formalizing Software Engineering Experiments: An Empirical Study, 2014, Int. J. Softw. Eng. Knowl. Eng.

[17] Brad A. Myers et al. A case study of using HCI methods to improve tools for programmers, 2012, 5th International Workshop on Co-operative and Human Aspects of Software Engineering (CHASE).

[18] Michael Derntl et al. The Impact of Perceived Cognitive Effectiveness on Perceived Usefulness of Visual Conceptual Modeling Languages, 2011, ER.

[19] Miguel Goulão et al. Do Software Languages Engineers Evaluate their Languages?, 2011, CIbSE.

[20] Christophe Kolski et al. State of the Art on the Cognitive Walkthrough Method, Its Variants and Evolutions, 2010, Int. J. Hum. Comput. Interact.

[21] Gabor Karsai et al. DSLs: the good, the bad, and the ugly, 2008, OOPSLA Companion.

[22] David G. Novick et al. Usability inspection methods after 15 years of research and practice, 2007, SIGDOC '07.

[23] Peter Wittenburg et al. ELAN: a Professional Framework for Multimodality Research, 2006, LREC.

[24] Ahmed Seffah et al. Evaluation of integrated software development environments: Challenges and results from three empirical studies, 2005, Int. J. Hum. Comput. Stud.

[25] Daniel M. Germán et al. Improving the usability of Eclipse for novice programmers, 2003, eclipse '03.

[26] Ahmed Seffah et al. Quantifying developer experiences via heuristic and psychometric evaluation, 2002, IEEE Symposia on Human Centric Computing Languages and Environments.

[27] Rick Spencer et al. The streamlined cognitive walkthrough method, working around social constraints encountered in a software development company, 2000, CHI.

[28] Andrew Sears et al. Heuristic Walkthroughs: Finding the Problems Without the Noise, 1997, Int. J. Hum. Comput. Interact.

[29] Barbara Kitchenham et al. DESMET: a methodology for evaluating software engineering methods and tools, 1997.

[30] J. B. Brooke. SUS: A 'Quick and Dirty' Usability Scale, 1996.

[31] Jill Gerhardt-Powals. Cognitive engineering principles for enhancing human-computer performance, 1996, Int. J. Hum. Comput. Interact.

[32] Jakob Nielsen et al. Heuristic Evaluation of Prototypes (individual), 2022.

[33] Randolph G. Bias et al. The pluralistic usability walkthrough: coordinated empathies, 1994.

[34] Cathleen Wharton et al. The cognitive walkthrough method: a practitioner's guide, 1994.

[35] Cathleen Wharton et al. Testing a walkthrough methodology for theory-based design of walk-up-and-use interfaces, 1990, CHI '90.

[36] Federico Botella et al. Programmer eXperience: A Systematic Literature Review, 2019, IEEE Access.

[37] Thomas D. LaToza et al. Human-Centered Methods to Boost Productivity, 2019, Rethinking Productivity in Software Engineering.

[38] Stefan Sobernig et al. A Daily Dose of DSL - MDE Micro Injections in Practice, 2018, MODELSWARD.

[39] Christian Burghard et al. Introducing MDML - A Domain-specific Modelling Language for Automotive Measurement Devices, 2016.

[40] Uirá Kulesza et al. Automated Support for Controlled Experiments in Software Engineering: A Systematic Review (S), 2013, SEKE.

[41] Markus Völter et al. MD* Best Practices, 2009, J. Object Technol.