Probabilistic models give us knowledge

The Demystification of Quantification


Column

Both the past and the present are full of examples of risk blindness, risk ignorance and downright stupidity. "The Reactor Safety Study" (WASH-1400) from 1975 provides a prime example of ignorance in evaluating risks and implementing the necessary actions.

In this report, the MIT physicist Norman Rasmussen introduced new probability-based methods of quantitative safety and risk assessment, known as probabilistic risk assessment (PRA), to cope with the increasing size and complexity of traditional fault trees.
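To see what such a quantification does, consider how a fault tree propagates the probabilities of basic component failures through logic gates to an undesired top event. A minimal sketch in Python (the gate structure and all probabilities are purely illustrative and not taken from WASH-1400):

```python
# Minimal fault-tree sketch: top event T = (A AND B) OR C,
# with independent basic events. Probabilities are illustrative.
p_a, p_b, p_c = 0.01, 0.05, 0.001

p_and = p_a * p_b                     # AND gate: both A and B must fail
p_top = 1 - (1 - p_and) * (1 - p_c)   # OR gate: at least one input occurs

print(f"P(top event) = {p_top:.6f}")  # approx. 0.0015
```

PRA extends this style of reasoning to entire accident sequences, which is precisely what made the approach attractive for systems too large for hand-built fault trees.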

In the years that followed, this report was widely attacked and criticized, in part because of its quantitative assessment of various risk scenarios. It had, however, correctly anticipated and evaluated the potential effects of a tsunami on a nuclear power plant. The report concluded that some plants that could be hit by a tsunami, or by higher water levels caused by tornadoes, would have to take the highest expected waves and water levels into account – in other words, the worst imaginable case.

That exact scenario became reality in 2011. On March 11, 2011, an earthquake off the coast of Tōhoku, Japan, triggered a tsunami that flooded a 500 km² area along the Pacific coast. The Japanese nuclear power plant Fukushima Daiichi (Fukushima I) was subsequently hit by 13- to 15-meter-high tsunami waves. Since Fukushima I was not connected to the existing tsunami warning system, the operating personnel received no advance notice. In addition, the protective wall facing the ocean was only 5.7 meters high.

Knowledge is not enough; it must be applied

The poet and natural scientist Johann Wolfgang von Goethe once said: "Knowing is not enough; we must apply. Willing is not enough; we must do." Yet nothing was done to implement the insights from Rasmussen's report. After all, it is much easier to use a black swan as an excuse for non-existent risk management.

Before the discovery of Australia, the Old World believed that all swans were white. This belief seemed unchallengeable because all empirical evidence appeared to confirm it. When the first black swan was sighted, that school of thought was shaken to its core. The example illustrates how quickly (historical) knowledge can be shattered and how we can be hampered or blinded by what we have learned through observation and experience. Because most observers at the time did not think outside the box, they simply assumed that all swans are white. Their experience made it inconceivable to them that black swans actually exist in all states of Australia, including the island of Tasmania.

Steering a company through a rear-view mirror

It is a well-known fact that humans systematically underestimate the painful effects of extreme events. The reasons are simple. We think in terms of logical stories, link facts into a coherent picture, and use the past as a model for the future – all to create a world in which we feel comfortable. Reality, however, is chaotic, surprising and unpredictable. Oftentimes, the past is a poor advisor that restricts our view. After all, we don't drive cars by looking in the rear-view mirror. Enterprise management is no different.

The black-swan phenomenon is tightly interwoven with the underlying philosophical problem of induction – in other words, extrapolating (finite) historical data into the future. This creates a conflict because the relevant extreme events are rare and, accordingly, have probably never occurred in the observed historical period.

Yet the Fukushima nuclear catastrophe, the Challenger explosion, the COVID-19 pandemic, the Wirecard scandal and a potential blackout scenario are not black swans.

A black swan is often used as an excuse for one's own failure to combine adequate anticipatory capabilities with competent methodologies.

We couldn't have known that. It was corruption

When it comes to analyzing risk blindness in the case of Wirecard, I have heard time and time again, "No one could have known that. It was corruption…"

Interestingly, multiple analysts had conducted meticulous analyses of Wirecard's balance sheets and data and concluded that the numbers Wirecard published were puzzling and could not have been accurate. Thomas Borgwerth had already recognized this in the autumn of 2013, using the comparable Dutch payment service provider Adyen as the foundation for his work. Borgwerth dug deeply into the annual reports of both companies and compared countless metrics with each other. Using quantitative risk analysis, he came to shocking conclusions that no one, however, wanted to hear. The hedge fund Greenvale Capital also inquired about dubious transactions but received no response. The Financial Times journalist Dan McCrum likewise posed the critical questions that an auditor or financial supervisory authority should normally have asked. So what happened to this valuable early-warning information? The investigative journalist was targeted by BaFin, Germany's financial supervisory authority, which on April 9, 2019 filed a criminal complaint against McCrum and fellow journalist Stefania Palma on the grounds of suspected market manipulation.

The supervisory authority wasn't really concerned with the important questions, such as how Wirecard managed to earn four times the margin of its competitors in this low-margin industry.

When ignorance is knowledge

"I know that I know nothing" is a timeless saying attributed to the Greek philosopher Socrates by the Roman politician and philosopher, Marcus Tullius Cicero. He probably wanted to express that he was lacking wisdom or real, complete knowledge without any doubts. True human wisdom, however, is to be aware of the absence of knowledge in the need for knowledge.

But why is ignorance so difficult to put into words? Simple – as a thing in itself, it doesn't exist. It is absurd that ignorance is so often cited as the antonym of knowledge. In today's world, we have knowledge deficits (for example, whether the universe is infinite or transient, or what positive and negative effects viruses and bacteria have in the environment, in animals, or in our own bodies), yet at the same time we are being flooded with a tsunami of useless information. The amount of information – but not the amount of useful information – grows by 2.5 quintillion bytes each and every day. As a result, people find it ever harder to distinguish relevant signals from all the other noise.

What is essential here is to be aware of the knowledge deficits, weigh different scenarios from various perspectives, and then reach a decision. This requires an assessment culture rooted in interdisciplinary, constructive discourse. The other option is to focus on a (desired) scenario and invent alternative facts. In politics (and, unfortunately, increasingly in the scientific world), this has become the rule rather than the exception.

Probabilistics makes knowledge more diverse and multifaceted

The way most companies evaluate risks today illustrates just how lightheartedly uncertainty is often treated. "Experts" assess each risk with a single probability of occurrence and extent of damage, as if they had seen the future in a crystal ball. Alternatively, individual scenarios (e.g., the worst case) are filtered out of the full spectrum of potential effects because they appear to fit the respective agenda. There is a presumption of knowledge that does not exist.

Stochastic statements, in contrast, deliver a range of potential scenarios. Rasmussen knew that back in 1975 when he wrote his analysis of nuclear risks. We simply do not know which surprises the future holds. Accordingly, risks should be evaluated in an interdisciplinary approach using a range of potential scenarios. Sound risk analysis offers a realistic range for future developments and avoids the presumed precision of single scenarios, which does not exist. The simplest form assesses three scenarios: the worst case, a realistic case and the best case.
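To make this tangible, here is a minimal Monte Carlo sketch in Python (all figures and the choice of a triangular distribution are illustrative assumptions, not taken from this article) that turns such a three-point expert estimate into a range of outcomes:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical three-point expert estimate for a single risk (in EUR)
best, realistic, worst = 10_000, 50_000, 250_000

# Model the estimate as a triangular distribution and draw many
# possible futures instead of committing to a single number.
losses = rng.triangular(left=best, mode=realistic, right=worst, size=100_000)

print(f"Expected loss: {losses.mean():,.0f} EUR")
print(f"Median loss:   {np.median(losses):,.0f} EUR")
print(f"95% quantile:  {np.quantile(losses, 0.95):,.0f} EUR")
```

The result is not a single "probability times impact" number but a distribution, from which expected values and quantiles can be read off.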

The world of stochastics and probabilistic models gives us knowledge that is more multifaceted and versatile, but no more precise. Rasmussen probably knew that as well while writing his probabilistic risk assessment. The Nobel Prize-winning physicist Richard P. Feynman taught the NASA management team the foundations of probability calculations as he examined the causes of the Challenger catastrophe. NASA's management had been aware of both the wear on the O-rings and the escaping combustion gases, yet chose to ignore them.

Stochastic scenario simulation intelligently combines expert knowledge (including intuition and gut instinct) with powerful statistical tools. Statistical thinking leads to greater competence in dealing with uncertainty. Understanding statistics is a necessary skill (not just for risk managers) for classifying and evaluating the world in which we live and for reaching decisions amid uncertainty. To paraphrase the Indian statistician C. R. Rao: secure knowledge ensues from a new way of thinking that combines uncertain knowledge with knowledge about the extent of the uncertainty. Like a statistician, a risk manager should also possess the following four competencies (a small aggregation sketch follows the list):

  1. Differentiating what is important from what is not
  2. Dealing with risk and uncertainty
  3. Structuring problems and translating them into method-based models
  4. Structuring data and translating it into solutions
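Building on the sketch above, the following illustrative example (the risk register and every figure in it are assumptions for demonstration purposes, not data from this article) shows how a stochastic scenario simulation aggregates several expert-assessed risks into one overall loss distribution:

```python
import numpy as np

rng = np.random.default_rng(7)
n_sims = 100_000

# Hypothetical risk register: occurrence probability plus a
# three-point impact estimate (best / realistic / worst, in EUR)
risks = [
    {"p": 0.30, "impact": (5_000, 20_000, 80_000)},
    {"p": 0.10, "impact": (50_000, 150_000, 600_000)},
    {"p": 0.05, "impact": (100_000, 400_000, 2_000_000)},
]

total = np.zeros(n_sims)
for r in risks:
    occurs = rng.random(n_sims) < r["p"]         # does the risk materialize?
    lo, ml, hi = r["impact"]
    impact = rng.triangular(lo, ml, hi, n_sims)  # severity if it does
    total += occurs * impact                     # add only realized losses

print(f"Expected annual loss: {total.mean():,.0f} EUR")
print(f"95% quantile:         {np.quantile(total, 0.95):,.0f} EUR")
print(f"99% quantile:         {np.quantile(total, 0.99):,.0f} EUR")
```

The 95% or 99% quantile of the simulated total loss is exactly the kind of realistic range – as opposed to a single, presumed-precise number – that sound risk analysis should deliver.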

Conclusion: We need risk analytics because the future is uncertain

Whoever believes that we cannot know the answers because the data is uncertain does not understand what risk analysis is. We need risk analysis because the future is uncertain and decisions need to be made in spite of this uncertainty.

Sound risk analysis offers a realistic range for future developments and avoids a presumed precision that does not exist.

In the real world, you will never have perfect information. Risk analysis can cope with a poor data basis and helps make the best possible use of the information available.

Risk managers should stop using a black swan as an excuse for their own failures. Instead, they should always be on the lookout for the gray rhino – a term for a slow-approaching, highly apparent danger that is all too easily ignored.

Further reading:

  • Feynman, Richard P. (1988): What Do You Care What Other People Think? Further Adventures of a Curious Character, W. W. Norton, New York 1988.
  • Rao, C. R. (1995): Was ist Zufall? Statistik und Wahrheit, Prentice Hall Verlag, München 1995.
  • Renn, O. (2019): Gefühlte Wahrheiten – Orientierung in Zeiten postfaktischer Verunsicherung, Verlag Barbara Budrich, Opladen 2019.
  • Romeike, Frank (2020): Event tree analysis (ETA), in: GRC aktuell, 3. Jahrgang, Dezember 2020, Nr. 4/2020, S. 153-157.
  • Romeike, Frank (2021): Risikowahrnehmungsfalle – Gefangen in einer Welt der „gefühlten Wahrheiten", in: Trend-Dossier, Ausgabe 01/2021, 13. Januar 2021.

Author

Frank Romeike
is the Managing Partner at RiskNET GmbH and the author of several standard works in the field of risk management. Under his direction, RiskNET has grown to become the #1 German competence site for risk management and compliance. Earlier in his career, Romeike served as Chief Risk Officer at IBM, where he contributed to the implementation of a global risk management process and directed several international projects. Romeike, who also serves as an adjunct professor at several national and international universities, ranks among the world's leading experts in risk and opportunity management.

 

[This article was first published in the avedos GRC News]
