The past as well as the present are full of examples of risk blindness, risk ignorance and downright stupidity. "The Reactor Safety Study" (Rasmussen, 1975), for instance, is a prime case of how risks can be correctly evaluated and yet ignored, of necessary actions left unimplemented, and of dogmatic debate between probabilistic and qualitative approaches to risk assessment. In this report, the MIT physicist Norman Rasmussen introduced a new, probability-based method of quantitative safety and risk assessment, called probabilistic risk assessment (PRA), to cope with the increasing size and complexity of traditional fault trees (fault tree analysis, FTA).
In the years that followed, the report was widely criticised, due in part to its quantitative assessment of various risk scenarios. It had, however, correctly anticipated and evaluated the potential effects of a tsunami on a nuclear power plant. The report concluded that power plants that could be hit by a tsunami, or by higher water levels caused by tornadoes, would have to design for the highest expected waves and water levels – in other words, for the worst imaginable case.
That exact scenario became reality in 2011. On March 11 of that year, an earthquake off the coast of Tōhoku, Japan, triggered a tsunami that flooded an area of 500 km² along the Pacific coast. The Japanese nuclear power plant Fukushima Daiichi (Fukushima I) was subsequently struck by tsunami waves 13 to 15 meters high. Since Fukushima I was not connected to the existing tsunami warning system, the operating personnel received no advance warning. In addition, the protective wall facing the ocean was only 5.7 meters tall.
After analysing the causes, the Nuclear Accident Independent Investigation Commission (NAIIC) came to the unequivocal conclusion "that the accident was clearly 'man-made'. We believe that the root causes were the organisational and regulatory systems that supported faulty rationales for decisions and actions" (NAIIC, 2012). The Reactor Safety Study, known as the WASH-1400 report, had not been taken seriously by those responsible, and a simpler, more lenient risk recommendation had been preferred.
We call this kind of ignoring of critical risks a "Cassandra scenario". The name refers to Cassandra from Greek mythology, who predicted the fall of Troy. The god Apollo, who loved Cassandra, gave her the gift of divination. However, since Cassandra did not return Apollo's love, he rendered the gift useless and condemned her never to be believed, however correct her prophecies were. And so it happened: Cassandra predicted the destruction of Troy, and nobody listened. The Trojans, drunk with victory up to that point, ignored her and pulled the wooden horse into their city, which ultimately led to their downfall at the hands of the Greeks. That is why, even today, people speak of a "Cassandra's call" when a warning goes unheard.
Does this phenomenon sound familiar? In many companies, this is not an uncommon practice. Sometimes it is far more comfortable to turn a blind eye to critical stress scenarios or major threats to strategic success factors. And if the scenario does occur, one can always point out that it was a "black swan".
Read the full article:
Frank Romeike / Stefan Koppold (2021): The painful financial side of NFR, in: Thomas Kaiser (Ed.): Non-Financial Risk Management – Emerging stronger after Covid-19, Risk Books, London.