Context and motivation. User stories are an increasingly popular textual notation for capturing requirements in agile software development [6]. A user story captures only the essential elements of a requirement: who it is for, what is expected from the system, and, optionally, why. Popularized by Mike Cohn [2], the best-known format is: “As a 〈type of user〉, I want 〈goal〉, [so that 〈some reason〉]”. For example: “As a Marketeer, I want to receive an email when a contact form is submitted, so that I can respond to it”.

Question/Problem. Despite this popularity, few methods exist to assess and improve user story quality. Existing approaches either employ highly qualitative metrics, such as the six mnemonic heuristics of the INVEST framework [10], or propose generic guidelines for quality in agile RE [4]. This prompted us to introduce the Quality User Story (QUS) framework in earlier work [5], a comprehensive linguistic approach to user story quality. The QUS framework separates the algorithmic aspects that natural language processing (NLP) techniques can process automatically from the thinking-required concerns that call for human requirements engineers. Our earlier work illustrates each quality criterion with a real-world example to demonstrate that the quality defects occur in practice [5].

Principal ideas/results. We take advantage of the potential offered by NLP. However, existing state-of-the-art NLP tools for RE such as Dowser [8] and RAI [3] have been unable to make the transition from academia to practice because their output is too inaccurate. The ambitious objectives of these tools demand a deep understanding of the requirements’ contents [1], which is currently unachievable and will remain so for the foreseeable future [9]. Instead, to be effective, tools that harness NLP should focus on the clerical part of RE that software can perform with 100% recall and high precision, leaving the remaining work to human requirements engineers [1]. Additionally, they should conform to what practitioners actually do, rather than to what published methods and processes advise them to do [7]. The popularity of user stories among practitioners and their simple yet strict structure make them ideal candidates for applying NLP tools and techniques.

Contribution. The Automatic Quality User Story Artisan tool, or AQUSA, takes a set of user stories as input and outputs errors and warnings that expose possible defects. Specifically, defects are identified by comparing the user stories against a subset of the QUS criteria [5]. A first release of this tool was completed in October 2015 and we
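The strict template is what makes such clerical checks feasible. As a minimal illustration (not AQUSA's actual implementation), the following Python sketch matches a story against Cohn's template and emits AQUSA-style errors and warnings; the regular expression, the check_story function, and the conjunction-based atomicity heuristic are assumptions made for this example.

```python
import re

# Illustrative sketch of an AQUSA-style well-formedness check (not the
# actual AQUSA implementation): verify that a user story follows Cohn's
# "As a <role>, I want <goal>[, so that <reason>]" template and report
# errors/warnings for defects that can be detected clerically.
TEMPLATE = re.compile(
    r"^\s*As an?\s+(?P<role>.+?)\s*,\s*"                   # who it is for
    r"I\s+(?:want|need|can|am able to)\s+(?P<goal>.+?)"    # what is expected
    r"(?:\s*,?\s*so that\s+(?P<reason>.+?))?\s*\.?\s*$",   # optional why
    re.IGNORECASE,
)

def check_story(story: str) -> list[tuple[str, str]]:
    """Return (severity, message) pairs for a single user story."""
    issues = []
    match = TEMPLATE.match(story)
    if match is None:
        issues.append(("error", "does not follow the 'As a ..., I want ...' template"))
        return issues
    if match.group("reason") is None:
        issues.append(("warning", "has no 'so that' clause explaining its purpose"))
    # Crude atomicity hint: a conjunction in the goal often hides two requirements.
    if re.search(r"\b(and|or)\b", match.group("goal"), re.IGNORECASE):
        issues.append(("warning", "goal contains 'and'/'or'; the story may not be atomic"))
    return issues

if __name__ == "__main__":
    stories = [
        "As a Marketeer, I want to receive an email when a contact form is "
        "submitted, so that I can respond to it.",
        "As an Administrator, I want to add and remove users.",
        "Receive an email when a contact form is submitted.",
    ]
    for story in stories:
        print(story)
        for severity, message in check_story(story) or [("ok", "no defects detected")]:
            print(f"  {severity.upper()}: {message}")
```

On the three sample stories, the sketch reports an error for the last one (template violation) and warnings for the second one (missing reason, possible non-atomic goal), mirroring the error/warning distinction described above while leaving all judgement-intensive quality concerns to the requirements engineer.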
References

[1] Berry, D. et al.: The Case for Dumb Requirements Engineering Tools. REFSQ (2012)
[2] Cohn, M.: User Stories Applied: For Agile Software Development. Addison-Wesley (2004)
[3] Gacitua, R. et al.: On the Effectiveness of Abstraction Identification in Requirements Engineering. 18th IEEE International Requirements Engineering Conference (RE) (2010)
[4] Heck, P., Zaidman, A.: A Quality Framework for Agile Requirements: A Practitioner's Perspective. arXiv (2014)
[5] Lucassen, G. et al.: Forging High-Quality User Stories: Towards a Discipline for Agile Requirements. 23rd IEEE International Requirements Engineering Conference (RE) (2015)
[6] Lucassen, G. et al.: The Use and Effectiveness of User Stories in Practice. REFSQ (2016)
[7] Maiden, N.: Exactly How Are Requirements Written? IEEE Software (2012)
[8] Popescu, D. et al.: Reducing Ambiguities in Requirements Specifications via Automatically Created Object-Oriented Models. Monterey Workshop (2008)
[9] Ryan, K.: The Role of Natural Language in Requirements Engineering. IEEE International Symposium on Requirements Engineering (1993)
[10] Wake, B.: INVEST in Good Stories, and SMART Tasks. XP123 blog (2003)