We at WTTA are convinced that for most organisational problems, there is a lot to be gained from adopting a test-and-learn approach. Creating a hypothesis around a pain point, and then running an experiment to test whether that hypothesis holds, is low risk and high reward regardless of the outcome. If the hypothesis is supported, you’ve moved closer to fixing your problem. If it isn’t, you’ve still gained valuable knowledge.

However, there are a few potential pitfalls along the way. The biggest ones lie in our cognitive biases: the shortcuts and ‘tricks’ our brain plays on us that prevent us from seeing reality as it truly is. One major bias is the planning fallacy, and it’s a huge risk not just to experimentation, but to organisational delivery more generally.

First though, what is the planning fallacy?

In essence, the planning fallacy is our inability as humans to predict the future, which leads to our inability to make accurate plans. We may believe that detailed requirements gathering, estimating and documenting will produce a solid plan for how the future will look, but it rarely, if ever, does. As with ambiguity aversion, it was behavioural economics that first shone a light on this human tendency to overestimate our own abilities.

In 1977, Kahneman and Tversky wrote a paper suggesting that humans aren’t particularly good at predicting the future. Instead of looking for objective data sets that allow accurate forecasting, we tend to believe that pieces of work can be completed within unrealistic boundaries (time, cost, scope, etc.), even when we have experienced precisely the opposite before.
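Their suggested corrective is to take the ‘outside view’: forecast from the distribution of outcomes of comparable past work, rather than from the details of the current plan. Here is a minimal sketch of the idea, assuming you hold some historical data (the figures below are invented for illustration):

```python
# A minimal sketch of "outside view" forecasting: instead of trusting
# our own estimate, scale it by how similar past projects actually
# turned out. The historical figures here are invented for illustration.

# Actual-duration / planned-duration ratios from comparable past projects
past_overruns = [1.4, 2.1, 1.0, 1.8, 2.5, 1.3, 3.0, 1.6]

def outside_view(inside_estimate_weeks, overruns, percentile=0.8):
    """Adjust an optimistic 'inside view' estimate using the empirical
    distribution of past overruns, returning a more defensible forecast."""
    ranked = sorted(overruns)
    # Pick the overrun ratio at the chosen percentile of past outcomes
    idx = min(int(percentile * len(ranked)), len(ranked) - 1)
    return inside_estimate_weeks * ranked[idx]

# Our plan says 10 weeks; history says plan for considerably more.
print(outside_view(10, past_overruns))  # -> 25.0 (10 weeks x 2.5 ratio)
```

The striking thing about this corrective is how rarely it is applied: the historical data usually exists, but our optimism tells us that this time will be different.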

This tendency towards optimism is observed irrespective of race, gender, pay grade or ‘intelligence’. It is often destructive, both economically and emotionally: when the plan fails, we tend to believe that the planner has failed. That raises the question: why has our friend evolution not rid humanity of this burden?

In short: our brain seems to have evolved to perpetuate the planning fallacy. Let me explain.

If you were asked to estimate your likelihood of being involved in a car accident, and were then shown that the true probability is higher than your estimate, your brain is less likely to use that information to update your beliefs than if your estimate had been more pessimistic than reality.

It turns out that optimism is paramount to maintaining a healthy mind and a healthy body. So, despite the negative consequences of poor planning, we must solve the problem ourselves, as it will not solve itself.

If any of the above is surprising or you want to learn about it further, I highly recommend this paper by Tali Sharot.

The web is littered with examples of large-scale construction programmes that are delivered years (even decades) later than planned, at costs up to 10 times greater than anticipated.

Bringing it closer to home, large organisations are renowned for investing large sums of money in pieces of work to increase revenue or reduce cost, only to generate a small fraction of those benefits over a much longer time horizon than expected. You, the reader, may be involved in a scenario like this right now. We invest in detailed planning, but our inherent optimism makes our plans less accurate. When we run a ‘lessons learned’ process at the end, to find out why reality didn’t match the plan, we don’t learn any lessons, again because of our inherent optimism, and so the cycle continues.

Now, equipped with the knowledge that the planning fallacy will persist in our organisations unless we do something about it, I hope it is clear why experimentation is important.

Experiments serve to draw in the boundaries associated with pieces of work. They are microcosms of a wider activity, executed before investment is committed. They are time-bound to days and weeks rather than months and years. And they are not cost prohibitive: we don’t need to invest £10M to find out whether a hypothesis holds.
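To make that concrete, here is a minimal sketch of how the outcome of such a small experiment might be evaluated. The scenario and figures are entirely hypothetical: we hypothesise that a new intake form reduces rework, trial it with a small group for two weeks, and compare the result against the current process.

```python
import math

def two_proportion_z(successes_a, n_a, successes_b, n_b):
    """Two-proportion z-test: is the gap between two observed rates
    bigger than chance alone would plausibly explain?"""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    pooled = (successes_a + successes_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal distribution
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical two-week trial: 18 of 40 requests needed rework under
# the current process, versus 8 of 40 with the new intake form.
z, p = two_proportion_z(18, 40, 8, 40)
print(f"z = {z:.2f}, p = {p:.3f}")  # here p ~ 0.017: unlikely to be chance
```

The statistics aren’t the point; the point is that a two-week, near-zero-cost trial gives you evidence before the big commitment, rather than a plan built on optimism.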

As a consequence, the impact of the planning fallacy can be significantly reduced by a well-designed experiment (provided the data from that experiment is put to good use!).

If we were to establish a culture where experimentation was a prerequisite to investing in a piece of work, the heartache and negative economic fallout attributed to the planning fallacy could be substantially reduced.

If this has resonated and you want some support in building a hypothesis around a problem statement or defining an experiment – we’d love to use the WTTA principles to help you. Get in touch.