Myth Buster

Psychology, behavioural economics, market research, website conversion and more by Neal Cole @northresearch

Business people pride themselves on their decision making, and many businesses embed market and competitor research into this process. However, business people are prone to the same human frailties as the rest of us, and these frailties can discourage the use of research and insight.

Behavioural science suggests that as people gain experience and knowledge in their area of expertise they have a tendency to become overconfident and complacent about their ability to understand the past and predict the future. Our brains assume that we are living in a simpler, more predictable world than is really the case.

This is one of the most useful insights of behavioural economics, yet it is a professionally difficult truth to acknowledge when we like to be seen as experts in our field. Indeed, we are sometimes informed that decisions have been made on the advice of an ‘expert’, as if this guaranteed the quality of the process.

As humans we are certainly prone to the illusion of understanding. Our minds create narrative fallacies from our continuous attempt to make sense of the world.

We notice the small number of unusual events that happen rather than the multitude of events that failed to occur.

Our memory is selective and biased by the workings of our mind. We construct vivid accounts of the past based on memories that change every time we recall them but believe they are a true reflection of past events.

We suffer from a tendency to like (or dislike) everything about a person, sometimes called the halo effect. This helps generate a simpler and more coherent representation of the world than is really the case. We fill gaps in our knowledge about a person with guesses that fit our emotional response.

Short-term emotions are probably the most powerful force in our decision making arsenal. Many of our judgements and decisions are directly influenced by feelings of liking and disliking rather than rational deliberation.

We hate uncertainty and suppress ambiguity because inconsistencies slow our thought processes and interfere with the clarity of our feelings. People are attracted to confidence, and we prefer decision makers who project it over someone who may be equally competent but wants to think a decision through before giving an answer.

People are heavily influenced by the What You See Is All There Is (WYSIATI) rule. We naturally work with the fragmentary information that we have access to as if it were all there is to know. The paradox is that it is easier to construct a consistent story when you have little knowledge. People make fallible guesses from incomplete information by taking a leap of faith about how things should work. Steven Pinker points out that our only defence is that this worked sufficiently well in the world of our ancestors.

“Our comforting conviction that the world makes sense rests on a secure foundation: our almost unlimited ability to ignore our ignorance.” Daniel Kahneman, Thinking, fast and slow.

This can lead us to define our choices too narrowly, which reduces our options. Research and working collaboratively can help by widening our horizons and introducing new insights that challenge our perception of the topic.

We are also very good at changing our beliefs after an unpredicted event without being aware of it. We often unconsciously adjust our view of the world and find it difficult to recall what we believed before the event. This leads us to evaluate the quality of decisions by the nature of the outcome rather than the process by which the decision was made.

“Asked to reconstruct their former beliefs, people retrieve their current ones instead.” Daniel Kahneman, Thinking, fast and slow.

The danger here is that people get blamed for a decision that resulted in a negative outcome despite the unpredictability of the event. In corporate decision making this can result in people relying on bureaucratic solutions to avoid blame which leads to extreme risk aversion.

It can also result in business people receiving unjustified rewards (e.g. bonuses) for taking irresponsible risks and simply being lucky. This was evident in the run-up to the 2008 financial crisis, when banks and other financial institutions paid risk takers massive remuneration packages for activities that put the whole financial system at risk. Due to the complexity of some of the assets involved, their AAA ratings proved to be illusory.

Kahneman asserts that any comparison of how successful or not companies have been is to a large extent a comparison between how lucky or not they have been. In every story of a successful company there will have been moments when the destiny of a firm could easily have turned in an instant.

So, is the analysis of the situation more important, or is it the process that is the key? Research conducted by Dan Lovallo and Olivier Sibony studied 1,048 major business decisions over five years. They found that “process mattered more than analysis by a factor of 6”.

“I have no use whatsoever for projections or forecasts. They create an illusion of apparent precision. The more meticulous they are, the more concerned you should be. We never look at projections …” Warren Buffett

This does not mean that analysis is unimportant and should not be undertaken. Rather it should be treated as only part of the jigsaw. When making decisions it is essential that we explore uncertainties and encourage discussion of opinions that may contradict the views of senior stakeholders. 

Intelligence and a high IQ are not normally associated with stupidity, but research suggests that our propensity to make rash, foolish or irrational decisions is often unrelated to our IQ. No one is immune from making daft decisions, and relying on IQ and educational qualifications as indicators of competence can be a recipe for disaster. When a business culture gives too much reverence to people with certain qualifications and skills, it can end up rewarding decisions based mainly on intuition rather than evidence.

CAVE Men! Colleagues Against Virtually Everything. People invest a lot of time and effort in their existing strategy or ideas. Dan Ariely calls this the not-invented-here bias: people tend to value their own ideas significantly more than others’ ideas. This can result in an obsessive focus on poor ideas and probably explains some of the less successful decisions that we come across in business.

Confirmation bias also means that we tend to ignore information that does not align with our existing beliefs. We subconsciously seek and are drawn to evidence that confirms our view of the world. People are very good at overlooking facts that undermine their opinions and will follow the crowd that most closely supports those beliefs. 

So where does this leave the customer insight professional? It demonstrates the need for a comprehensive strategy for promoting the use of insight and collaboration to facilitate innovation and evidence-based decision making.

  • Stakeholder management is essential, not just to obtain the buy-in and support of senior management, but also to counteract many of the myths about how research and insight are undertaken.
  • Use storytelling to engage people at an emotional level. Our brains become more active when we are told a story, not only the language processing part of our brain, but also other areas we would normally use to experience the events of the story in real life. Some evidence suggests that our brains can synchronize with the brains of the person telling a story. 
  • Spend time getting to understand your audience and their preconceptions. When it comes to insight, stakeholders’ “What You See Is All There Is” view is often shaped largely by survey research.
  • Never underestimate the importance of how choices are presented and ensure you are fully prepared so that you avoid uncertainty about your recommended approach.   
  • Immerse yourself in the customer-facing side of your business by meeting and observing how your organisation interacts with your customers. Don’t rely on third parties or management to identify the real challenges customer-facing staff have to deal with.
  • Identify information gaps to highlight the need for research and insight. We sometimes need reminding of how little we really know about the world to puncture our illusion of understanding.
  • Challenge default methods of conducting research. Examine the potential for alternative approaches to insight, including experiments, observation and collaborative methods.
  • Encourage a culture of experimentation. For instance use A/B and multivariate testing on your website to understand what content most engages and motivates your existing and potential customers.
  • To counter hindsight bias, always ask key stakeholders before you commission a project what they expect the outcome or findings to be. You can then use these as hypotheses to prove or disprove their views.
  • Encourage all areas of the business to share insights and engage with the research process. It shouldn’t just be marketing and customer services that buy in to customer insight. This helps avoid groupthink by bringing diversity into the decision making process.
The author: Neal Cole has over 20 years’ experience of working in insight and website optimization for some of the UK’s largest financial services providers and online companies. Neal is based near Chester and works in London as a conversion specialist for a large online gaming group. He is a regular contributor to the GreenBook Blog market research website. You can follow Neal on Twitter @northresearch and view his LinkedIn profile.
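The A/B testing recommendation in the list above can be made concrete with a minimal significance check comparing the conversion rates of two page variants. This is an illustrative sketch only: the visitor counts and conversion figures are invented, and a simple two-proportion z-test stands in for whatever statistics your testing tool reports.

```python
# Hypothetical A/B test readout: did variant B convert better than variant A?
# All figures are invented for illustration.
from math import sqrt, erfc

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis of no difference
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = erfc(abs(z) / sqrt(2))  # two-sided p-value from the normal CDF
    return z, p_value

# Variant A: 120 conversions from 2,400 visitors (5.0%)
# Variant B: 156 conversions from 2,400 visitors (6.5%)
z, p = two_proportion_z(conv_a=120, n_a=2400, conv_b=156, n_b=2400)
print(f"z = {z:.2f}, p = {p:.4f}")
```

In practice the sample size and significance threshold should be decided before the test runs; repeatedly peeking at the p-value as data accumulates inflates the chance of a false positive, which is exactly the kind of self-serving interpretation the biases above encourage.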

Further reading: Thinking, fast and slow by Daniel Kahneman, Herd by Mark Earls (@Herdmeister), Decoded by Phil Barden (@philbarden), How to Get People to Do Stuff by Susan Weinschenk (@thebrainlady).