Expertise, Ignorance, Naiveté and Risk

One of the prerequisites of good threat and opportunity management is to be knowledgeable about the environment in which you operate. Goal-seeking decisions and actions are informed by your sense of the world and your influence over it.

For the past few years there has been a growing wave of evidence showing that, in many cases, the confidence we have in our understanding of the world and our ability to control it is seriously misplaced.

A clear message has been emerging from many fields of research, including behavioural economics, neuroscience, physics, biology, psychology, and social science: our view of the world as a rational, logical model of cause and effect is often sustained only by selectively ignoring clear evidence to the contrary.

Simple risk management methodologies are predicated on an ability to predict the events which may jeopardise the attainment of your goals, judge the likelihood of those events and estimate the impact of their consequences. The effectiveness of these methods is clearly going to be undermined if our predictions are flawed, and I believe they are flawed far more often than most risk managers (and project managers, and policy directors, and sales representatives, and …) are willing to admit.
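To make the mechanics concrete, here is a minimal sketch of that kind of method: a toy risk register in Python that scores each threat as likelihood × impact and ranks the results. It is my own illustration rather than any particular methodology, and every event name, probability and cost in it is invented.

```python
# Toy risk register: each threat gets an estimated probability and an
# estimated impact, and their product ("expected loss") ranks the risks.
# All events, probabilities and costs here are invented for illustration.

risks = [
    {"event": "Key supplier fails",      "likelihood": 0.10, "impact": 500_000},
    {"event": "Project slips a quarter", "likelihood": 0.30, "impact": 200_000},
    {"event": "Regulation changes",      "likelihood": 0.05, "impact": 1_000_000},
]

for risk in risks:
    risk["expected_loss"] = risk["likelihood"] * risk["impact"]

# Rank by expected loss - note that the whole exercise stands or falls
# on how good those likelihood and impact guesses actually are.
for risk in sorted(risks, key=lambda r: r["expected_loss"], reverse=True):
    print(f"{risk['event']}: expected loss ~ {risk['expected_loss']:,.0f}")
```

The arithmetic is trivial; the fragile part is the pair of guessed numbers feeding it, which is exactly where the flawed predictions creep in.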

I’m sure this theme will recur in many future posts; for this entry, I’ll just offer a few recent examples of it playing out.

In Thinking, Fast and Slow, Kahneman references a large body of research to support the idea that people are guided by two different modes of thinking: System 1, which is deeply instinctual and focused on short-term survival, and System 2, which is rational and able to develop complex, even contradictory mental models. System 1 is the default, and it serves us well most of the time while we remain in familiar territory. Occasionally we engage System 2 to handle challenging thoughts, but this requires a lot of effort, and people are inherently prone to conserve energy in case of emergencies (in other words, lazy).

The key to making consistently well-informed decisions is to have enough experience with a situation that System 1 is trained by repeated examples over many years, while at the same time knowing when to alert System 2 to the possibility of exceptions to the norm.

Since reading the book, I’ve found myself seeing System 1 and System 2 at work in a number of other recent findings (and, in doing so, perhaps becoming overconfident in my own understanding!). Thanks to my Twitter sources for alerting me to these gems.

Last month, David Ropeik wrote an excellent op-ed piece for the New York Times Sunday Review called “Why Smart Brains Make Dumb Decisions about Danger” which discusses recent findings on the architecture of the brain and why this leads us to be so bad at predicting risk factors.

This month, in a piece recorded for the BBC, Jeremy Wagstaff raised the interesting question “If we can’t imagine the past, what hope the future”, reminding us just how little we know of the past – and that’s supposed to be the easy bit!

An article by Maia Szalavitz in the September issue of Time magazine discusses the psychological factors which might explain “Why Misinformation Sticks and Corrections Can Backfire” – rejecting information takes more mental effort than accepting it. Hardly the characteristics of a trustworthy decision model!

The October 6th issue of the Economist reviews a book by Nate Silver called “The Signal and the Noise: Why So Many Predictions Fail – But Some Don’t”, which offers a range of examples and research exploring the ways in which people struggle to think in probabilistic terms and to incorporate uncertainty into their mental models. Silver suggests that people learn to keep an open mind and adjust their models to reflect available observations.
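As a toy illustration of what adjusting a model to reflect observations can look like in practice (my own sketch, not an example from Silver’s book), here is a single Bayesian update in Python; all the probabilities are invented.

```python
# Toy Bayesian update: revise a prior belief after one observation.
# All probabilities below are invented for illustration only.

prior = 0.30                # prior belief that the project will slip
p_miss_if_slip = 0.80       # chance of a missed milestone if it will slip
p_miss_if_no_slip = 0.20    # chance of a missed milestone even if it won't

# A milestone has just been missed; Bayes' rule revises the belief.
evidence = p_miss_if_slip * prior + p_miss_if_no_slip * (1 - prior)
posterior = (p_miss_if_slip * prior) / evidence

print(f"Belief that the project will slip: {prior:.0%} -> {posterior:.0%}")
```

The arithmetic is mechanical; keeping an open mind about the inputs is the part people struggle with.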

Finally, and perhaps fittingly, consider this review of a book by Oliver Burkeman entitled “The Antidote: Happiness for People Who Can’t Stand Positive Thinking”, which urges readers to learn to enjoy uncertainty, embrace insecurity and become familiar with failure.

Easier said than done!
