Experiments on quality evaluation of embedded software in Japan robot software design contest

As a practical opportunity for educating young Japanese developers in embedded software development, a software design contest was held in which participants designed software to automatically control a line-trace robot and then ran performance tests. In this paper, we report the results of the contest from the viewpoint of software quality evaluation. We create a framework for evaluating software quality that integrates design model quality and final system performance, and we analyze the contest results using this framework. The analysis shows that quantitative measurements of the structural complexity of the design models correlate strongly with the qualitative evaluation of the designs conducted by the judges. It also shows that there is no strong correlation between the design model quality evaluated by the judges and the final system performance. For embedded software development, it is particularly important to estimate and verify reliability and performance in the early stages, using the model. Based on these results, we consider possible remedies with respect to the submitted models, the evaluation methods used, and the contest specifications. In order to adequately measure several non-functional quality characteristics, including performance, on the model, it is necessary to improve the way robot software is developed (for example, by applying model-driven development) and to reexamine the evaluation methods.
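As a concrete illustration of the kind of analysis such a framework supports, the sketch below computes rank correlations between per-team structural-complexity measurements, judges' design scores, and final running-test results. All variable names and data values are illustrative assumptions, not data from the contest.

```python
# Illustrative sketch (assumed data): correlating design-model complexity
# metrics with judges' design scores and with final running-test results,
# in the spirit of the evaluation framework described above.
from scipy.stats import spearmanr

# Hypothetical per-team measurements (one entry per contest team).
structural_complexity = [34, 21, 48, 27, 39, 18, 52, 30]   # e.g. state/transition counts
judge_design_score    = [62, 81, 45, 74, 58, 85, 40, 70]   # qualitative design evaluation
running_performance   = [71, 55, 60, 66, 49, 73, 58, 64]   # final line-trace test score

# Spearman rank correlation avoids assuming a linear relationship.
rho_cj, p_cj = spearmanr(structural_complexity, judge_design_score)
rho_jp, p_jp = spearmanr(judge_design_score, running_performance)

print(f"complexity vs. judge score:  rho={rho_cj:.2f} (p={p_cj:.3f})")
print(f"judge score vs. performance: rho={rho_jp:.2f} (p={p_jp:.3f})")
```

A strong correlation in the first pair together with a weak one in the second would mirror the findings reported above.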
