In a previous blog, “There’s more to decision making than meets the eye… or why we shouldn’t dismiss gut feelings”, inspired by Malcolm Gladwell’s book ‘Blink’ (1), I made a case for the discretionary use of intuition in decision making. I argued that:
- There seems to be a particular role for intuition: a) when encountering very new or different options for which known criteria are simply not valid; b) when decisions based on intuition cannot be explained in a logical way.
- There are circumstances where it would be quite risky to rely on one’s intuition: a) when under tremendous stress; b) when there is just too much information to be digested; c) when our subconscious ‘houses’ prejudices that we are not aware of.
Comments from my readers suggested that other practitioners of Lean and Six Sigma also see a role for intuition alongside fact-based analysis of problems and root causes, in the evaluation of potential solutions, and in decision making.
Having now read Ben Goldacre’s book ‘Bad Science’ (2), I have some new reflections to add to this discussion. Goldacre cites three main problems with intuition.
1. Our brains are conditioned to look for, and ‘see’, patterns and causal relationships where there may be only random noise.
Goldacre gives examples of random sequences of numbers that, when presented to people, appear to ‘reveal’ clusters and patterns, even though statistical analysis would show that none exist.
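For readers who like to experiment, here is a minimal Python sketch of this effect (my own illustration, not an example taken from Goldacre’s book): even a purely random sequence of coin flips will usually contain runs long enough to ‘look like’ a pattern.

```python
import random

# A small illustration: how readily 'patterns' appear in pure randomness.
random.seed(0)  # arbitrary seed, only so the example is repeatable
flips = [random.choice("HT") for _ in range(100)]

# Find the longest run of identical outcomes in the random sequence
longest = current = 1
for prev, nxt in zip(flips, flips[1:]):
    current = current + 1 if nxt == prev else 1
    longest = max(longest, current)

print("Longest run of identical flips:", longest)
```

In 100 random flips a run of five or six identical outcomes is entirely typical; shown such a sequence, most of us would be tempted to read meaning into it.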
The ability to rapidly and intuitively spot patterns of activity, and causal relationships between them, may once have been an important survival mechanism for humans. Today, however, it could be very misleading in process improvement, where, for example, we want to make sure that we focus our efforts on addressing the truly significant problems.
Approaches such as Pareto analysis, quantification of issues (or Undesirable Effects – UDEs) and matrix diagrams can help us to review data more objectively and thereby focus on the right things.
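By way of illustration, here is a minimal Python sketch of a Pareto analysis. The defect causes and counts are entirely made up for the example:

```python
# Hypothetical defect counts - illustrative data only
defects = {
    "labelling errors": 120,
    "late deliveries": 45,
    "damaged packaging": 30,
    "wrong quantity": 15,
    "other": 10,
}

total = sum(defects.values())
cumulative = 0.0

# Rank causes from most to least frequent and accumulate their share
for cause, count in sorted(defects.items(), key=lambda kv: kv[1], reverse=True):
    cumulative += 100 * count / total
    print(f"{cause:18} {count:4d}  {cumulative:5.1f}%")
```

In this made-up data set the top two causes account for 75% of all defects: the ‘vital few’ that a Pareto chart would tell us to tackle first, rather than whichever issue our intuition happens to latch onto.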
2. We have a bias towards positive evidence.
In the words of Francis Bacon, quoted by Goldacre: “It is the peculiar and perpetual error of human understanding to be more moved and excited by affirmatives than negatives.”
We are much more likely to pay attention to findings that prove our theories than to those that do not. That is why, as noted in another quote in Goldacre’s book, Darwin made a point of recording every piece of negative evidence that he came across.
Goldacre expands on this bias further by saying that we:
- Overvalue information that confirms our hypotheses
- Seek out information that will confirm our hypotheses
Our natural bias towards positive evidence is also why process improvement and change management exercises such as force-field analysis, SWOT (Strengths, Weaknesses, Opportunities, Threats) analysis, FMEA (Failure Mode and Effects Analysis) and Six Thinking Hats can be so powerful. Knowledge Management practitioners also make a point of capturing ‘deltas’, or ‘what could be improved’, in learning reviews, retrospects or ‘After Action Reviews’.
These tools, when applied to process improvement and decision making, encourage us to think about what might prevent our solutions from succeeding rather than getting carried away by how wonderful they are! They also help us to present this understanding more clearly in our communication activities or dialogues with our stakeholders (sponsors, colleagues and customers).
3. Our assessment of the quality of new evidence is biased by what we believe.
If we are aware of this potential pitfall, we can aim to be more receptive to opposing views. In a team of people who have been working together for some time, shared beliefs may well outnumber instances of opposing views.
An effective team leader could look out for and encourage differences of opinion as a potential way of overcoming the team’s bias in assessing new evidence. Discussions with customers, suppliers and other stakeholders could also be very powerful for this.
Now that we know we can also be blinded by our need to see patterns, causal relationships, and evidence that confirms what we believe, we need to be doubly cautious when applying intuition to process improvement and decision making.
As change practitioners know, we value resistance from stakeholders because it highlights potential areas for consideration that those implementing the change may be blind to. We now know that we should also value resistance from stakeholders as a counter-balance to the risks of intuition.
However, we should continue to bear in mind that there is a role for intuition in certain circumstances.
(1) “Blink. The power of thinking without thinking” by Malcolm Gladwell, Back Bay Books, 2007
(2) “Bad Science” by Ben Goldacre, Harper Perennial, 2009
(3) Elisabeth Goodman is Owner and Principal Consultant at RiverRhee Consulting, a consultancy that uses process improvement and knowledge management to enhance team effectiveness.