Hazard Analysis of Verification Supporting Arguments for Assured Autonomy

The kinds of systems we are building, and the ways we are building them, are evolving. This evolution is invalidating the analyses and assumptions on which we have relied as bases for design assurance, creating a need for new criteria and means of compliance for many autonomy-enabling technologies. While research, development, and standards bodies are actively investigating assurance bases for these technologies, the community will need to make sense of the results as they emerge. We require evaluation frameworks and decision support to establish trust in, and guide selection among, new verification concepts and methods. In this work, we propose a lens for evaluating verification methods under development, to ground new criteria, standards, and means of compliance for assuring and approving adaptive and intelligent systems. We root the evaluation framework in examining verification as a system in its own right, with a job to do and ways it can fail to do that job. We then outline a structured argument by which a verification method can be concluded fit for purpose if it meets its requirements and the hazards of its use are adequately mitigated. To identify these hazards, we illustrate how industry-standard hazard analysis can be performed on verification itself, and how the results of such an analysis can be integrated into structured arguments supporting stakeholder communication and decision making. Finally, we note environments in which we are beginning to use this approach, including providing feedback within standards development activity.