Assessing Loss Event Frequencies of Smart Grid Cyber Threats: Encoding Flexibility into FAIR Using Bayesian Network Approach

Assessing loss event frequencies (LEF) of smart grid cyber threats is essential for planning cost-effective countermeasures. Factor Analysis of Information Risk (FAIR) is a well-known framework that considers threats in a structured manner using look-up tables tied to a taxonomy of threat parameters. This paper proposes a method for constructing a Bayesian network that extends FAIR to obtain quantitative, high-granularity LEF results through a traceable and repeatable process, even for fuzzy input. Moreover, the proposed encoding enables sensitivity analysis that shows how changes in fuzzy input contribute to the LEF. Finally, the method can highlight the most influential elements of a particular threat to help plan countermeasures better. The numerical results of applying the method to a smart grid show that our Bayesian model not only provides an evaluation consistent with FAIR, but also supports more flexible input, produces more granular output, and illustrates how individual threat components contribute to the LEF.
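As a rough illustration of the idea described above, the sketch below encodes a fragment of the FAIR taxonomy (Threat Event Frequency, Threat Capability, Resistance Strength, Vulnerability, LEF) as a small discrete Bayesian network and computes an LEF distribution by enumeration. The level counts, prior distributions, and combination rules are illustrative assumptions, not the paper's actual model; fuzzy analyst input is represented here as soft evidence, i.e., probability distributions over the FAIR levels rather than a single look-up cell.

```python
import numpy as np

# FAIR-style discretization: each factor takes levels 0..4 (VL, L, M, H, VH).
LEVELS = 5

# Hypothetical soft-evidence priors encoding "fuzzy" analyst input as
# distributions over the levels (illustrative numbers, not from the paper).
p_tef = np.array([0.05, 0.15, 0.40, 0.30, 0.10])   # Threat Event Frequency
p_tcap = np.array([0.10, 0.20, 0.40, 0.20, 0.10])  # Threat Capability
p_rs = np.array([0.05, 0.25, 0.40, 0.25, 0.05])    # Resistance Strength

def vuln_level(tcap, rs):
    # Deterministic CPT mimicking a FAIR look-up matrix: vulnerability
    # rises with threat capability and falls with resistance strength.
    return int(np.clip(2 + (tcap - rs), 0, LEVELS - 1))

def lef_level(tef, vuln):
    # FAIR-style rule: loss events cannot occur more often than threat events.
    return min(tef, vuln)

# Exact inference by enumeration over the small discrete network,
# yielding a full LEF distribution instead of a single table cell.
p_lef = np.zeros(LEVELS)
for tef in range(LEVELS):
    for tcap in range(LEVELS):
        for rs in range(LEVELS):
            v = vuln_level(tcap, rs)
            p_lef[lef_level(tef, v)] += p_tef[tef] * p_tcap[tcap] * p_rs[rs]

print(p_lef)  # granular LEF distribution over the five FAIR levels
```

Because the network is queried rather than read off a table, a sensitivity analysis can be sketched by perturbing one input distribution (say, `p_rs`) and observing the shift in `p_lef`, which is the kind of countermeasure-planning insight the abstract refers to.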