Quality in use of domain-specific languages: a case study

Domain-Specific Languages (DSLs) are claimed to increase productivity while reducing the required maintenance effort and programming expertise. In this context, DSL usability is a key factor for their successful adoption. In this paper, we propose a systematic approach, based on experimental validation techniques from User Interface evaluation, to assess the impact of introducing DSLs on the productivity of domain experts. To illustrate this evaluation approach, we present a case study of a DSL for High Energy Physics (HEP). The DSL in this case study, called Pheasant (PHysicist's EAsy Analysis Tool), is assessed against a pre-existing baseline that uses General-Purpose Languages (GPLs) such as C++. The comparison combines quantitative and qualitative data collected from users in a real-world setting. Our assessment includes physicists with programming experience, covering two profiles: those with no experience with the framework previously used in the project, and those already experienced with it. This work highlights the lack of systematic approaches for the experimental validation of DSLs. It also illustrates how an experimental approach can be used to evaluate a DSL during the Software Language Engineering process, with respect to its impact on effectiveness and efficiency.
