Evaluations of software technologies: testing, cleanroom, and metrics (development methodology, characteristic set, offline software review, empirical study)
Abstract: A seven-step approach for quantitatively evaluating software technologies couples software methodology evaluation with software measurement. The approach is applied in depth in three studies. (1) Software Testing Strategies: a 74-subject study, including 32 professional programmers and 42 advanced university students, compared code reading, functional testing, and structural testing in a fractional factorial design. (2) Cleanroom Software Development: fifteen three-person teams separately built a 1200-line message system to compare Cleanroom software development (in which software is developed completely off-line) with a more traditional approach. (3) Characteristic Software Metric Sets: in the NASA Software Engineering Laboratory (S.E.L.) production environment, a study of 65 candidate product and process measures of 652 modules from six projects (51,000 to 112,000 lines each) yielded a characteristic set of software cost/quality metrics. The approach proved effective for quantitatively evaluating software technologies across a variety of problem domains. With the professionals, code reading detected more software faults and had a higher fault detection rate than did functional or structural testing. With the students, the three techniques did not differ noticeably in the number of faults detected or in the fault detection rate. Code reading detected more interface faults, and functional testing detected more control faults, than did the other methods. Most developers using the Cleanroom approach were able to build their systems completely off-line. The Cleanroom teams' products met the system requirements more completely and succeeded on more operational test cases than did those developed with the more traditional approach.
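As a minimal illustrative sketch of the kind of comparison summarized above, the Python fragment below computes, per technique, the mean number of faults detected and a fault detection rate, here assumed to mean faults detected per hour of technique application; the observation records, field layout, and numbers are hypothetical and are not taken from the study.

    # Sketch: comparing verification techniques by faults detected and by
    # fault detection rate (assumed here to be faults detected per hour).
    # All observations below are hypothetical placeholders; the actual study
    # used a fractional factorial design over 74 subjects.
    from collections import defaultdict

    # (technique, faults_detected, hours_applied) per subject-program observation
    observations = [
        ("code_reading", 5, 2.0),
        ("code_reading", 4, 1.5),
        ("functional_testing", 4, 2.5),
        ("functional_testing", 3, 2.0),
        ("structural_testing", 3, 2.5),
        ("structural_testing", 2, 2.0),
    ]

    totals = defaultdict(lambda: {"faults": 0, "hours": 0.0, "n": 0})
    for technique, faults, hours in observations:
        t = totals[technique]
        t["faults"] += faults
        t["hours"] += hours
        t["n"] += 1

    for technique, t in totals.items():
        mean_faults = t["faults"] / t["n"]
        detection_rate = t["faults"] / t["hours"]  # faults per hour
        print(f"{technique}: mean faults detected = {mean_faults:.2f}, "
              f"fault detection rate = {detection_rate:.2f} faults/hour")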