Today’s post will reflect on one of the most pervasive cognitive biases exhibited by humanity: the anchoring bias. In short, the anchoring bias is our tendency to rely too heavily on an initial piece of information, the anchor, when making judgements, often causing us to overlook or even ignore later evidence that tells us to change the way we act.

But before we unpack the anchoring bias any further, I am conscious that a brief explanation of the origins of cognitive bias may be useful.

As a species, we have a unique gift: the ability to use logic and rationality to make decisions.

Or do we?

Research from many different areas of study, including economics and psychology, has repeatedly shown that humans systematically make decisions and choices that completely contradict what logic and probability tell us we should do.

But why?

Evolutionary psychologists will point you in the direction of heuristics, a term used to describe the fast but fallible cognitive processes that underpin our decision making. The theory is that over the course of human evolution the brain became increasingly complex and energy intensive, so nature evolved a way of short-circuiting our decision making to limit that drain on our energy; et voilà, cognitive biases. The way our brain takes shortcuts, ignoring facts and evidence, is a feature of our evolution, not a flaw of an untrained mind.

I think it is important to note here that the term bias often has a negative connotation in day-to-day language. In these posts, however, any reference to a bias is simply a statement of fact, not a judgement on whether its existence is good or bad.

Right, back to anchoring.

Imagine you are given a random number between 1 and 100, as are 99 other people. You are then asked what percentage of the UK’s population fail their driving test at the first time of asking.

Studies with scenarios like this show that those given a higher random number tend to predict a higher failure rate, and those given a lower number predict a lower one. Somehow, participants are anchored to their random number, skewing their answers to a question completely unrelated to it.

For a more detailed synopsis of the anchoring bias, please see the papers by Lieder et al. here and by Furnham & Boo here (purchase only).

If we apply that to a work context, the risks are immediately clear. Our estimates of the time and cost of a new piece of work are skewed by a previous activity. We believe that our existing operating model and flow of work are closer to optimal than they actually are. We continue with time-consuming sign-off processes that were created because of an incident a decade ago, even when they’re no longer relevant to our current situation.

All of these things slow us down and drive inefficiencies into our working patterns, leading to longer hours, greater stress, poor work-life balance and sometimes even career stagnation.

So what do we do about this?

Well, we can’t change our brains, not overnight anyway, and not when we’re fighting something they have evolved to do over hundreds of thousands of years. But we can change the frequency and volume of information at our disposal to reduce the impact of any single anchor. The best way to do this is to run experiments: short, sharp pieces of work that test hypotheses across multiple locations, populations and departments. The inevitable variation in the results will help overcome some of the effects of anchoring and provide a much more robust data set on which to base decisions.

WTTA is founded on the principle that an experimental organisational culture has the potential to yield significant improvements in the way that we work. This is not least because it serves to dampen the effects of natural biases.

If you would like to know more about how we support individuals and teams to design experiments, please get in touch.