
Bad Decisions

January 24, 2018

This was the title of an interactive seminar, led by Professor Paul Moxey, around the topic of how to avoid cognitive bias – or at least limit its effect on decisions.

“A cognitive bias refers to a systematic pattern of deviation from norm or rationality in judgment, whereby inferences about other people and situations may be drawn in an illogical fashion. Individuals create their own ‘subjective social reality’ from their perception of the input.” (Wikipedia)

The seminar identified many types of cognitive bias, before focusing on three factors.

First, managers in organisations are often encouraged – by culture or by financial rewards – to develop an optimistic cognitive bias, which can lead to disaster for the organisation. Examples include Kids Company, where “The Chief Executive and Trustees relied on wishful thinking and false optimism and became inured to the precariousness of the charity’s financial situation”, and HBOS, where “The FCA Final Notice found that the Corporate Division had a culture of optimism, incentivised revenue focus rather than risk and viewed risk management as a constraint on the business rather than essential to it”. This optimism bias is often found in organisations that encourage groupthink rather than challenge or diversity of thought. The Bay of Pigs invasion of Cuba is a classic example.

Second, corporate damage or failure frequently arises from cognitive bias among senior staff. One example discussed was the belief that all risks from IT system failure should be dealt with by the CTO and his team, whereas of course senior staff also share that responsibility. Many current examples of system failure arise from human error – insecure handling of passwords leading to unauthorised access, allowing hackers to create rogue transactions and flood systems until they go down. And loss aversion must be responsible for some of the major cost over-runs on IT and other projects before a halt is called. We worked on an anonymised case to discuss the cognitive biases that could have caused the failure of governance.

Third, a source of bad decisions, often aligned to cognitive bias, is the use of small samples to derive or support experimental results, as described by Daniel Kahneman and Amos Tversky. Many people are innumerate, but even those with strong numerical skills quite often fail to spot when this is happening. It was suggested that expressing results as both a number and a percentage helps; either alone can be misleading.

How to guard against cognitive bias and make better decisions? Our list is as follows:

  • Define the problem you are trying to solve and check whether other perspectives see it differently.
  • Use views of the future to develop a set of possible options.
  • Test the options against all the stakeholders, not just the actors.
  • Look for counter-evidence on the future and current position – what are you not seeing?
  • Make a decision and explain it in detail to all stakeholders – be prepared to revise if necessary.
  • Use a pre-mortem technique – imagine yourself in five years’ time and that the decision has been a disaster. Why might that have happened? How could you have made it successful?

The list looks long but can be implemented within a week, and most decisions can wait a week!

Written by Gill Ringland, SAMI Fellow Emeritus.

The views expressed are those of the author and not necessarily of SAMI Consulting.

If you enjoyed this blog from SAMI Consulting, the home of scenario planning, please sign up for our monthly newsletter and/or browse our website.
