An approach for the evaluation of software engineering environments in medicine.

This article examines several criteria for evaluating software engineering environments (SEEs) in medicine. The study is restricted to the evaluation of the SEE itself, not of its by-products, i.e. the medical applications developed with it. Basic principles for an evaluation methodology are presented: they consist of determining the evaluation objectives and judging a SEE according to criteria grouped into three broad categories (functional, generic, and environmental), each reflecting a particular domain of evaluation of the SEE. Methods of measurement and questions highlighting these specific areas are discussed. The criteria are derived from the list of objectives of the HELIOS project, part of the European AIM programme of the Commission of the European Communities. Special emphasis is placed on the criteria through which the medical specificity and usefulness of a SEE can be assessed; for this purpose, a method of measuring such appropriateness is proposed.
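To make the criterion-based judgment concrete, one plausible realization of such a methodology is a weighted aggregation of per-category ratings. This is only an illustrative sketch: the three category names come from the abstract, but the rating scale, weights, and function names below are invented for the example and are not taken from the article.

```python
# Hypothetical sketch of criterion-based SEE evaluation.
# The categories (functional, generic, environmental) follow the
# article; the 0-1 rating scale and the weights are assumptions
# made for illustration only.
def see_score(ratings, weights):
    """Combine per-category ratings (each in [0, 1]) into one
    weighted overall score, normalized by the total weight."""
    total = sum(weights.values())
    return sum(weights[c] * ratings[c] for c in ratings) / total

# Example: an evaluator might weight medical (functional)
# appropriateness more heavily than generic tooling criteria.
ratings = {"functional": 0.8, "generic": 0.6, "environmental": 0.7}
weights = {"functional": 3, "generic": 1, "environmental": 2}
print(round(see_score(ratings, weights), 2))  # prints 0.73
```

The normalization by total weight keeps the result on the same 0-1 scale as the individual ratings, so scores of different SEEs remain directly comparable even if the evaluator later changes the weighting.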
