The problem with relying on intuition for process improvement and decision making.

In a previous blog, “There’s more to decision making than meets the eye… or why we shouldn’t dismiss gut feelings”, inspired by Malcolm Gladwell’s book ‘Blink’ (1), I made a case for the discretionary use of intuition in decision making. I argued that:

  1. There seems to be a particular role for intuition: a) when encountering very new or different options for which known criteria are simply not valid; b) where decisions based on intuition cannot be explained in a logical way.
  2. There are circumstances where it would be quite risky to rely on one’s intuition: a) when under tremendous stress; b) when there is just too much information to be digested; c) where our subconscious ‘houses’ prejudices that we are not aware of.

Comments from my readers suggested that other practitioners of Lean and Six Sigma also see a role for intuition alongside fact-based analysis of problems and root causes, in the evaluation of potential solutions, and in decision making.

Having now read Ben Goldacre’s book ‘Bad Science’ (2), I have some new reflections to add to this discussion. Goldacre cites three main problems with intuition.

1. Our brains are conditioned to look for, and ‘see’, patterns and causal relationships where there may be only random noise.

Goldacre gives examples of random sequences of numbers that, when presented to people, ‘reveal’ clusters and patterns when statistical analyses would show that none exist.
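Goldacre’s point is easy to reproduce for ourselves: purely random sequences routinely contain runs and clusters that look meaningful to the eye. Here is a minimal sketch in Python (the coin-flip example is my own invention, not one of Goldacre’s) that measures the longest run in a random sequence:

```python
import random

def longest_run(seq):
    """Length of the longest run of identical consecutive values."""
    best = run = 1
    for prev, cur in zip(seq, seq[1:]):
        run = run + 1 if cur == prev else 1
        best = max(best, run)
    return best

# 100 fair coin flips: a streak of six or more identical outcomes is
# typical in a sequence this long, yet it reads like a 'pattern'.
flips = [random.choice("HT") for _ in range(100)]
print("".join(flips))
print("Longest run:", longest_run(flips))
```

Run it a few times: almost every sequence contains a streak that, presented out of context, would tempt us to hunt for a cause.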

The ability to rapidly and intuitively spot patterns of activity, and causal relationships between them, may once have been an important survival mechanism for humans. Today, however, it can be very misleading in process improvement, where we want to make sure that we focus our efforts on the truly significant problems.

Approaches such as Pareto analysis, quantification of issues (or Undesirable Effects – UDEs) and matrix diagrams can help us to review data more objectively and thereby focus on the right things.
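By way of illustration, a Pareto tally can be as simple as counting issues, sorting by frequency, and tracking the cumulative percentage to identify the ‘vital few’. A minimal Python sketch, using invented defect categories purely for illustration:

```python
from collections import Counter

# Hypothetical defect log: one entry per observed issue
defects = ["late delivery", "wrong item", "late delivery", "damaged",
           "late delivery", "wrong item", "late delivery", "billing error"]

counts = Counter(defects).most_common()   # sorted by frequency, descending
total = sum(n for _, n in counts)

cumulative = 0
for issue, n in counts:
    cumulative += n
    print(f"{issue:15} {n:3d}  {100 * cumulative / total:5.1f}%")
```

The cumulative column makes it immediately clear which one or two issues account for most of the undesirable effects, instead of leaving that judgement to pattern-hungry intuition.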

2. We have a bias towards positive evidence.

In the words of Francis Bacon, quoted by Goldacre: “It is the peculiar and perpetual error of human understanding to be more moved and excited by affirmatives than negatives.”

We are much more likely to pay attention to findings that prove our theories than to those that do not. That is why, in another quote in Goldacre’s book, Darwin made a point of noting every piece of negative evidence that he came across.

Goldacre expands on this bias further by saying that we:

  1. Overvalue information that confirms our hypotheses
  2. Seek out information that will confirm our hypotheses

Our natural bias towards positive evidence is also why process improvement and change management exercises such as force-field analysis, SWOT (Strengths, Weaknesses, Opportunities, Threats) analysis, FMEA (Failure Mode and Effects Analysis) and Six Thinking Hats can be so powerful. Knowledge Management practitioners also make a point of capturing ‘deltas’ or ‘what could be improved’ in learning reviews, retrospects or ‘After Action Reviews’.

These tools, when applied to process improvement and decision making, encourage us to think about what might prevent our solutions from succeeding rather than getting carried away by how wonderful they are!  They also help us to present this understanding more clearly in our communication activities or dialogues with our stakeholders (sponsors, colleagues and customers).

3. Our assessment of the quality of new evidence is biased by what we believe.

If we are aware of this potential pitfall, we can aim to be more receptive to opposing views. In a team of people that have been working together for some time, shared beliefs may well predominate over opposing views.

An effective team leader could look out for and encourage differences of opinion as a potential way of overcoming the team’s bias in assessing new evidence.  Discussions with customers, suppliers and other stakeholders could also be very powerful for this.

Conclusion

Now that we know that we can be additionally blinded by our need to see patterns, causal relationships, and confirmatory evidence of what we believe, we need to be doubly cautious in applying intuition for process improvement and decision making.

As change practitioners know, we value resistance from stakeholders as this highlights potential areas for consideration that those implementing the change may be blind to.  We know now that we should also value resistance from stakeholders as a counter-balance to the risks of intuition.

However, we should continue to bear in mind that there is a role for intuition in certain circumstances.

Notes

(1) “Blink: The Power of Thinking Without Thinking” by Malcolm Gladwell, Back Bay Books, 2007

(2) “Bad Science” by Ben Goldacre, Harper Perennial, 2009

(3) Elisabeth Goodman is Owner and Principal Consultant at RiverRhee Consulting, a consultancy that uses process improvement and knowledge management to enhance team effectiveness.

Follow the links to find out more about RiverRhee Consulting, and about Elisabeth Goodman.

9 thoughts on “The problem with relying on intuition for process improvement and decision making.”

  1. The great value of Six Sigma, for me, is the requirement to use a defined model, either DMAIC or DFSS, and have data to back it up. Lean provides a very new way of looking at processes, work, life, and so on (at least, new for most western thinkers 20 years ago! Says something about resistance that for a lot of people it’s still revolutionary). Other approaches, like Theory of Constraints, for example, bring other value. So does Gladwell. You point this out very clearly.

    While I have talked about Gladwell’s book and his insights, my take has been slightly different. It really depends on learning style and cognitive function on an individual level. Strongly intuitive types, like me, have an intellectual responsibility to know our intuition as a double-edged blade, and strive to use critical thinking, logic and evidence to examine our intuitions. Strongly experiential and evaluative types can cultivate their intuition as well as honoring it in other people. They provide perspectives to look at problems; the effective person will be able to move, to alter the perspective and in fact will do so on purpose, as part of their own internal SWOT framework.

    1. elisabethgoodman

      Mike,

      you make some very good points. I particularly like the way you describe the importance of understanding our personal styles or levels of intuition and how to balance them accordingly with factual approaches.

      Thanks for sharing.

      Elisabeth

  2. One of the questions here is how subjectivity works in groups. It would seem likely that the quality of a group’s intuition depends partly on the cohesion of the group. If the group is brittle, then the intuition may also be brittle.

    So in order to trust the intuition of the group, we may need to pay attention to the dynamics of the group and/or the cultural context.

    I’ve started a discussion on the LinkedIn lenscraft group.

    1. elisabethgoodman

      Richard – yes this makes a lot of sense. Although conversely, in a strongly knit group, there is a greater risk of having less questioning of assumptions…

  3. Hi Elisabeth,

    Your blog caught my eye because of the mention of Gladwell, “Blink,” and how intuition affects how people make decisions. I am writing this response in case you or your readers are interested in a few additional resources on the subject.

    Several of the stories and examples in Blink come from research Malcolm learned about through conversations with my former employer, Gary Klein. Klein has been studying decision making for over 30 years and is a pioneer in understanding why and how some people make good and effective decisions under stress, time pressure, and in ambiguous and conflicting situations. His model of decision making in such circumstances is known as the recognition-primed decision (RPD) model.

    http://www.fastcompany.com/magazine/38/klein.html

    Klein’s work describes a phenomenon: how some people are able to make good decisions in situations characterized by the challenges I have just mentioned. The gist of the RPD model is that some people don’t make a “choice” about using intuition. They just act in certain situations. The reason they are able to act, without going through traditional decision-making processes, is that they have developed a repertoire of experiences, strategies and shortcuts that come to mind when they see a familiar situation. This enables them to rapidly make sense of what is going on and know how to respond in a way that seems instinctual to the rest of us. Likewise, that repertoire helps them spot problems and detect anomalies when things don’t look “right” or how they should. These are hallmarks of expertise.

    What is happening is that these people are relying on their expertise to make quick decisions. The RPD model was created by studying soldiers, pilots, critical care nurses, fire fighters, nuclear power plant control room operators, and other professionals who face split-second decisions every day and don’t have the time for formal analyses. RPD doesn’t argue that intuition is the best way to make decisions; it simply explains how people (experts) make good decisions under time pressure and stress.

    The pitfalls you’ve outlined above have been identified by researchers studying these phenomena, and many of them are related to the process of sensemaking, or how we size up our environments and make sense of our surroundings, which then provides the context for making decisions. I co-authored a book chapter with Gary Klein that talks about how we can be led astray by our sensemaking processes.

    http://www.amazon.com/Expertise-Out-Context-International-Naturalistic/dp/0805855106#reader_0805855106

    A few points from the book chapter: The assertion that we look for and see patterns in data is absolutely true. The key to making sense of data, and not being led astray by it, is to know and understand the context in which the data is being presented. In the example you cite above, looking for patterns in random streams of data probably does result in false connections and misinterpretations. For people to make sense of data appropriately, they must be working within some context, task or goal in which they have sufficient expertise. Another pitfall in sensemaking is creating a story or hypothesis early on, and then interpreting subsequent data against that mental model or frame, which you have captured in points 2 and 3. In the research that predated our book chapter, this was one of the key items we studied: how are people able to break those frames and remain open to possibilities? I am not sure we have arrived at a sufficient answer to that research question yet.

    Klein’s work is very interesting. He has written Sources of Power: How People Make Decisions which much more eloquently describes the points I tried to make above. His second book, The Power of Intuition, explains his unique definition of intuition. They are great reads and full of neat stories. He even suggests exercises to combat many of the errors in human judgement that often get us all.

    What I learned about decision making helped me become a better change management and OD practitioner. I realized that intuition is not magical or mystical. What we call intuition is a label for an observed phenomenon that is really just a function of expertise, and expertise is something that we can study, elicit, and use in our knowledge management and change work. And decision analysis has its place in helping to check our gut instincts, provided we have the time and resources to use the tools appropriately.

    1. elisabethgoodman

      Deb, this is wonderful input. I really appreciate you taking the time to write such a detailed and informative response. I’m sure my other readers will be very interested in it too, and I will certainly look up some of the references you quote. Thank you!

