Reliability and risk analysis data base development: an historical perspective

Abstract

The collection of empirical data and the development of data bases for use in predicting the probability of future events have a long history. At least as far back as the 17th century, safe-passage and mortality events were collected and analyzed to uncover prospective underlying classes and their associated attributes. Tabulations of these classes and attributes formed the underwriting basis of the fledgling insurance industry. Much earlier, master masons and architects used design rules of thumb to capture the experience of the ages and thereby produce structures of incredible longevity and reliability (Antona, E., Fragola, J. & Galvagni, R., Risk based decision analysis in design. Fourth SRA Europe Conference Proceedings, Rome, Italy, 18–20 October 1993). These rules served so well in producing robust designs that it was not until almost the 19th century that the analysis (Charlton, T.M., A History of Theory of Structures in the 19th Century, Cambridge University Press, Cambridge, UK, 1982) of masonry voussoir arches, begun by Galileo some two centuries earlier (Galilei, G., Discorsi e dimostrazioni matematiche intorno a due nuove scienze (Discourses and Mathematical Demonstrations Concerning Two New Sciences), Leiden, The Netherlands, 1638), was placed on a sound scientific basis. Still, with the introduction of new materials (such as wrought iron and steel) and the lack of theoretical knowledge and computational facilities, approximate methods of structural design abounded well into the second half of the 20th century. To this day structural designers account for material variations and gaps in theoretical knowledge by employing factors of safety (Benvenuto, E., An Introduction to the History of Structural Mechanics, Part II: Vaulted Structures and Elastic Systems, Springer-Verlag, New York, NY, 1991) or codes of practice (ASME Boiler and Pressure Vessel Code, ASME, New York, NY) originally developed in the 19th century (Antona, E., Fragola, J. & Galvagni, R., Risk based decision analysis in design. Fourth SRA Europe Conference Proceedings, Rome, Italy, 18–20 October 1993). These factors, although they continue to be heuristically based, attempt to account for uncertainties in the design environment (e.g., the load spectra) and for residual material defects (Fragola, J.R. et al., Investigation of the risk implications of space shuttle solid rocket booster chamber pressure excursions. SAIC Document No. SAIC/NY 95-01-10, SAIC, New York, NY). Although the approaches may appear different, at least at first glance, the intention in both the insurance and design arenas was the same: to establish an 'infrastructure of confidence' to enable rational decision making for future endeavours. Maturity in the design of conventional structures such as bridges, buildings, boilers, and highways has caused us to lose sight of the role that robustness plays in qualifying these designs against their normal failure environment. So routinely do we expect these designs to survive that we tend to think of the individual failures (which do occur on occasion) as isolated 'freak' accidents. Attempts to uncover potential underlying classes and to document their associated attributes are rare, and even when they are undertaken, 'human error' or a 'one-of-a-kind accident' is often cited as the major cause, which somehow seems to absolve the analyst of the responsibility for further data collection (Levy, M. & Salvadori, M., Why Buildings Fall Down, W.W. Norton and Co., New York, NY, 1992; Pecht, M., Nash, F.R. & Long, J.H., Understanding and solving the real reliability assurance problems. 1995 Proceedings of the Annual RAMS Symposium, IEEE, New York, NY, 1995). The confusion has proliferated to the point where legitimate calls for scepticism regarding the scant data resources available (Evans, R.A., Bayes paradox. IEEE Trans. Reliab., R-31 (1982) 321) have given way to cries that some data sources be abandoned altogether (Cushing, M. et al., Comparison of electronics-reliability assessment approaches. IEEE Trans. Reliab., 42 (1993) 542–546; Watson, G.F., MIL Reliability: a new approach. IEEE Spectrum, 29 (1992) 46–49). Authors who once suggested that generic data collection be abolished in favour of a physics-of-failure approach (Watson, G.F., MIL Reliability: a new approach. IEEE Spectrum, 29 (1992) 46–49) now seem to suggest that the concept of 'failure rate' be banished altogether, and with it the concept of reliability prediction (Pecht, M. & Nash, F., Predicting the reliability of electronic equipment. Proc. IEEE, 82 (1994) 992–1004). There can be no doubt that abuses of generic data exist and that the physics-of-failure approach has merit, especially in design development. However, does the situation really justify abandoning the collection, analysis, and classification of empirical failure data and eliminating reliability or risk prediction? If not, can the concepts of 'failure rate' and 'prediction' be redefined so that they provide meaningful support to logical decision making? This paper reviews both the logical and the historical context within which reliability and risk data bases have been developed, so as to generate an understanding of the motivations for, and the assumptions underlying, their development. Further, an attempt is made to clarify what appears to be a fundamental confusion in the field of reliability and risk analysis. With these clarifications in hand, a restructuring of the conceptual basis for reliability data base development and reliability prediction is suggested, and some encouraging recent developments are reported.