Expertise, Ignorance, Naiveté and Risk

One of the prerequisites of good threat and opportunity management is to be knowledgeable about the environment in which you operate. Goal-seeking decisions and actions are informed by your sense of the world and your influence over it.

For the past few years there has been a growing wave of evidence showing that, in many cases, the confidence we have in our understanding of the world and our ability to control it is seriously misplaced.

A clear message has been emerging from many fields of research, including behavioural economics, neuroscience, physics, biology, psychology and social science – our view of the world as a rational, logical model of cause and effect is often sustained only by selectively ignoring clear evidence to the contrary.

Simple risk management methodologies are predicated on an ability to predict the events which may jeopardise the attainment of your goals, judge the likelihood of those events and estimate the impact of their consequences. The effectiveness of these methods is clearly going to be undermined if our predictions are flawed – and I believe they are flawed far more often than most risk managers (and project managers, and policy directors and sales representatives, and …) are willing to admit.
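As a concrete illustration of that dependence on prediction, here is a minimal sketch of the likelihood-times-impact scoring that such methodologies boil down to. The risks, probabilities and dollar figures below are invented for illustration; the point is that the entire ranking is only as good as the estimates fed into it.

```python
# Minimal sketch of a likelihood-times-impact risk register.
# All risks, probabilities and impacts are invented for illustration.
risks = [
    # (description, annual probability, impact in $ if it occurs)
    ("Key supplier fails",       0.10, 500_000),
    ("Critical staff departure", 0.30,  80_000),
    ("Data centre outage",       0.05, 900_000),
]

# Rank by expected loss: probability x consequence.
for name, p, impact in sorted(risks, key=lambda r: r[1] * r[2], reverse=True):
    print(f"{name:26s} expected loss = ${p * impact:>8,.0f}")
```

If the probability estimates are systematically wrong – and the research above suggests they often are – the tidy ranking is worse than useless: it lends false confidence to a flawed prediction.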

I’m sure this theme will recur in many future posts; for this entry, I’ll just offer a few recent examples of it playing out.

In Thinking, Fast and Slow, Kahneman references a large body of research to support his central idea that people are guided by two different modes of thinking – System 1, which is deeply instinctual and focused on short-term survival, and System 2, which is rational and able to develop complex, even contradictory, mental models. System 1 is the default, and it serves us well most of the time while we remain in familiar territory. Occasionally we employ System 2 to handle challenging thoughts, but this requires a lot of effort, and people are inherently prone to conserve energy in case of emergencies (lazy).

The key to making consistently well-informed decisions is to have enough experience with a situation that System 1 has been trained by repeated examples over many years, while at the same time knowing when to alert System 2 to consider the possibility of exceptions to the norm.

Since reading the book, I’ve found myself seeing System 1 and System 2 at work in a number of other recent findings (and in doing so, perhaps becoming overconfident in my own understanding!) – thanks to my Twitter sources for alerting me to these gems.

Last month, David Ropeik wrote an excellent op-ed piece for the New York Times Sunday Review called “Why Smart Brains Make Dumb Decisions about Danger” which discusses recent findings on the architecture of the brain and why this leads us to be so bad at predicting risk factors.

This month, in a piece recorded for the BBC, Jeremy Wagstaff raised the interesting question – “If we can’t imagine the past, what hope the future?” – reminding us just how little we know of the past. And that’s supposed to be the easy bit!

An article by Maia Szalavitz in the September issue of Time magazine discusses the psychological factors which might explain “Why Misinformation Sticks and Corrections Can Backfire” – rejecting information takes more mental effort than accepting it. Hardly the characteristics of a trustworthy decision model!

The October 6th issue of the Economist reviews Nate Silver’s book “The Signal and the Noise: Why So Many Predictions Fail – But Some Don’t”, which offers a range of examples and research exploring the ways in which people struggle to think in probabilistic terms and to incorporate uncertainty into their mental models. He suggests that people learn to keep an open mind and adjust their models as new observations become available.
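The prescription is essentially Bayesian: hold beliefs as probabilities and revise them as evidence arrives. A minimal sketch of that updating loop, using an invented project-management scenario and made-up numbers:

```python
# Bayesian updating for a binary hypothesis; scenario and numbers are invented.
def update(prior: float, lik_if_true: float, lik_if_false: float) -> float:
    """Bayes' rule: posterior = P(evidence|H) * P(H) / P(evidence)."""
    num = prior * lik_if_true
    return num / (num + (1 - prior) * lik_if_false)

belief = 0.05  # prior: a 5% chance the project is in serious trouble
for slip in range(1, 5):
    # a slipped milestone is three times as likely if the project is in trouble
    belief = update(belief, lik_if_true=0.6, lik_if_false=0.2)
    print(f"after slip {slip}: P(trouble) = {belief:.2f}")
```

Four slipped milestones move a 5% prior to roughly 80%; the model stays open to revision at every step rather than locking in an early yes-or-no verdict.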

Finally, and perhaps fittingly, consider this review of Oliver Burkeman’s book “The Antidote: Happiness for People Who Can’t Stand Positive Thinking” – learn to enjoy uncertainty, embrace insecurity and become familiar with failure.

Easier said than done!

Expecting to find the unexpected

If you search hard enough through enough information, can you expect to find the unexpected?

A number of angles occurred to me while reading a recent review in Wired magazine of the US DHS’s counter-terrorism ‘fusion centres’. The simple summary is that the US has spent somewhere between $289 million and $1.4 billion to produce ‘a Bunch of Crap’. I have personal experience of national centres, although nothing on the scale of the US DHS, and I can see how this smelly outcome comes about.

The article discusses reviews by Senate (political) committees into public service activities. The perspective and motivations of such reviews should automatically prompt you to ask what spin, if any, was desired in the outcome. The references and ‘facts’ quoted in the reviews help the reader form some opinion, but remember that somewhere in another filing cabinet there will be a review, commissioned by the public servants, praising the outcomes of the work. The truth lies (?) somewhere between these two extremes, and in my experience, the more clearly the objective evidence is presented separately from the subjective, the better. While this is true of government across the world, national security seems to face particular challenges due to the gravity and sensitivity of the mission and the cloaks which operate around the edges of activities and information.

That the expenditure can only be bracketed somewhere between $289M and $1.4B is itself an indictment of fiscal accountability and transparency. For a smaller country like, say, Australia, the population-proportional equivalent would be funding somewhere between roughly $21 million and $104 million. Still a five-fold range, but absolute numbers of that size are less likely to keep anyone awake for long. It does show the scale of things in the US, and it suggests that even at this scale, it’s not working.
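For what it’s worth, the population-proportional arithmetic is simple enough to check; the population figures below are rough 2012 values I have assumed, not numbers from the article:

```python
# Scale the quoted US range to an Australia-sized country by population.
# Rough 2012 populations; my assumption, not a figure from the Wired article.
us_pop, au_pop = 310e6, 23e6
low_usd, high_usd = 289e6, 1.4e9
factor = au_pop / us_pop
print(f"equivalent range: ${low_usd * factor / 1e6:.0f}M to ${high_usd * factor / 1e6:.0f}M")
# -> equivalent range: $21M to $104M
```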

In my experience, the idea of ‘sharing information’ between a broad set of stakeholders to arrive at a cohesive, more informed, holistic result is extremely difficult to realise. The best ‘fusion’ usually occurs between individuals maintaining personal connections across agency lines. The more bureaucratic the process, the less dynamic and adaptive it becomes and, inevitably, the lower the fidelity of the complex, dynamic, real-world information trail. Agencies would do better by encouraging and facilitating these domain-wide communities, but that would require lowering the drawbridge into the silo and giving up some sense of control and budget.

Even if you can get agencies to collaborate with shared intent and genuine cooperation, there are technical and methodological challenges in fusing, synthesising and distilling volumes of data generated in a wide variety of contexts. The military terminology of ‘situation awareness’ and ‘common operating picture’ is often used here; however, the military environment is significantly different from that of national security. Indeed, the military is only one of many environments that come together here, along with health, emergency management, intelligence, foreign affairs, customs and others. No matter how parochial the different arms of the defence services are, the military has the benefit of a much more tightly integrated command-and-control structure than will ever exist in the vast domain of national security. Methods drawn from complexity science and extreme-risk analysis are more likely to achieve results than brute force applied to the assumption that answers will simply drop out once you have the right goggles.

The article discusses the diversion of funds to local crime fighting, and the police’s twist that local crime is a form of local terrorism. There are definitely overlaps between the capabilities required for national security and for local security. The goal of the national security investment, however, should be to improve national security and, as a by-product, to enhance capabilities which are also useful in local security, natural disasters, pandemics and CBRN accidents.

An alternative strategy is to invest in the capability of capability management itself. Using this approach, capabilities for handling routine emergencies in many domains are pooled as a portfolio which is configured in response to specific threats. Rather than investing in dedicated national capabilities which may never be utilised, this approach focuses the investment on the ability to manage the stakeholder network – something which is taken as a given in typical national initiatives, yet which is usually the weakest link in the strategy-to-execution chain.
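To make the idea concrete, here is a toy sketch of such a pooled portfolio: routine capabilities are tagged with the roles they already serve, and a configuration is assembled in response to a specific threat. Every capability name, owner and tag below is hypothetical; the point is that the investment sits in the configuration and coordination step rather than in standing, single-purpose capabilities.

```python
# Toy sketch of a pooled capability portfolio configured per threat.
# All names, owners and tags are hypothetical illustrations.
from dataclasses import dataclass

@dataclass
class Capability:
    name: str
    owner: str       # the agency that maintains it day to day
    tags: frozenset  # the routine roles this capability already serves

portfolio = [
    Capability("Field medical teams",   "Health",  frozenset({"pandemic", "mass_casualty"})),
    Capability("Aerial surveillance",   "Customs", frozenset({"border", "search_rescue"})),
    Capability("Hazmat response units", "Fire",    frozenset({"cbrn", "industrial_accident"})),
    Capability("Intel analyst cell",    "Police",  frozenset({"organised_crime", "terrorism"})),
]

def configure(portfolio, threat_tags):
    """Select the subset of routine capabilities relevant to a specific threat."""
    return [c for c in portfolio if c.tags & threat_tags]

# Assemble a configuration for a hypothetical CBRN incident with casualties.
for c in configure(portfolio, {"cbrn", "mass_casualty"}):
    print(f"{c.name} (owned by {c.owner})")
```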

Bursting the bubble

This is the first post, written from a world in which we constantly recalibrate our sense of control over, and knowledge of, the world against occasional reminders of the complex, dynamic environment in which we actually live. While the impacts of extreme events stir us out of our complacency, we soon slip back into the comfort of a naive belief that tomorrow will look like today.