The PreCertification Kit for Operating Systems in Safety Domains

Nowadays, software increasingly takes over functionality traditionally implemented in hardware, and software architectures have consequently grown larger and more complex. Such architectures commonly include an Operating System (OS). In safety domains (e.g., avionics, railway), however, compliance with a safety standard (e.g., DO-178B) is mandatory, which means that evidence on the software life cycle of each software component, including the OS, must be available. This evidence, which constitutes the certification package of the OS, might not be available for commercial or open-source OSs; certifying them therefore requires the complementary creation of evidence to serve as certification input. Because the certification process is costly, the system integrator must carefully select the candidate OS, and support for identifying the OS most suitable for certification would be of great value. In this position paper, we introduce our future research on the development of a Precertification Kit (PK), that is, a framework that supports the evaluation of OSs with respect to certification requirements. The PK can also be integrated into the development toolchain to support the implementation of safer, higher-quality OSs, and it provides additional evidence for the certification package.
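To make the selection problem concrete, a minimal sketch of how candidate OSs could be ranked against weighted certification criteria is shown below. This is purely illustrative and not the PK's actual method: the criterion names, weights, and the linear weighted-coverage score are all assumptions introduced here for the example.

```python
from dataclasses import dataclass

@dataclass
class Criterion:
    name: str
    weight: float  # hypothetical relative importance under the target standard

def score_candidate(evidence: dict[str, float], criteria: list["Criterion"]) -> float:
    """Weighted evidence-coverage score in [0, 1]; higher = better certification fit."""
    total = sum(c.weight for c in criteria)
    return sum(c.weight * evidence.get(c.name, 0.0) for c in criteria) / total

# Hypothetical criteria a precertification assessment might weigh.
criteria = [
    Criterion("structural_coverage", 0.40),
    Criterion("robustness_testing", 0.35),
    Criterion("traceability", 0.25),
]

# Per-OS evidence availability, normalized to [0, 1] (invented values).
candidates = {
    "rtos_a": {"structural_coverage": 0.9, "robustness_testing": 0.6, "traceability": 0.8},
    "rtos_b": {"structural_coverage": 0.5, "robustness_testing": 0.9, "traceability": 0.4},
}

# Rank candidates from best to worst certification fit.
ranked = sorted(candidates, key=lambda k: score_candidate(candidates[k], criteria),
                reverse=True)
print(ranked)  # best candidate first
```

The point of the sketch is only that such a kit turns an informal selection decision into a comparable, evidence-based score per candidate OS.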
