Sunday, 28 June 2015

Governments need to get better at recognising when they are wrong

Governments and their civil service advisors need to learn how to admit failure. If they could, their policies would be better and their big capital investment programmes might be more successful. I’m not optimistic.

Apple is sometimes thought of as the firm that can do no wrong. Their current position as the most valuable quoted firm on the US stock market is often ascribed to an uncanny ability to create exactly the product consumers really, really want.

This view is wrong and the reason why contains a valuable lesson for governments and their civil service advisors.

I’ll stick with the history of their most successful product to make my point.

When the iPhone was launched I was something of a skeptic about whether it would be successful. I’m a big fan of the legendary Don Norman (famed for his work on design, such as The Psychology of Everyday Things). Norman makes a powerful argument that generalist devices do a worse job of all their tasks than specialist devices. So a computer that tries to be a phone looked like it would be both a bad phone and a bad computer.

Norman was wrong (at least about the iPhone) and so was I.

So, when Apple launched the first iPhone, I didn’t want one, especially at the price. I did eventually buy one but only when the UK suppliers were clearing out their original stock to make way for the iPhone 3G. I think I paid £150, probably 30% of the original launch price. I wasn’t just a skeptic on the features, I really didn’t want to pay their OTT asking price.

The public agreed with me on the price. As, eventually, did Apple, who lopped a third off the original price and, if I remember rightly, offered refunds to some original purchasers to assuage their anger that the price had been dropped.

The important thing is that Apple learned from their error on pricing and, as far as I know, have never had to clear out significant volumes of obsolete stock for any later iPhone model.

Apple has made a whole series of about-turns on what looked like set-in-stone features of the iPhone, despite the famously and tyrannically opinionated views of Steve Jobs. Originally there were to be no native apps (everything was going to be a web app). They changed their mind. Originally the form factor was the perfect size for the hand and wasn’t going to change (a position based on solid original research). They have since changed it twice, despite Jobs declaring the original screen size to be perfect.

The point of these changes is that Apple knows how to learn. They don’t have a magical ability to get things right first time, but they really know how to adapt, and they do it quickly. They admit their errors and change. Even while the famously stubborn Jobs was still alive and in charge, they didn’t just stick with his original ideas when the evidence said they weren’t working.

Governments need to learn this skill.

There are two reinforcing pressures that prevent governments from learning. One is the nature of political promises. Political parties base their manifesto promises on the assumption that they know the answers to problems. And the civil servants who advise them when they enter government are promoted and rewarded not for solving problems but for not being seen to fail.

The consequences of failing to admit failure are large. Many (perhaps most) real-world problems require a degree of experimentation. Even with vast effort and the best available research, the correct solution to a problem is often far from obvious. Apple may (in retrospect) have radically changed the world of mobile phones, but they only did so after several stumbles, and they did so because they were willing to admit their mistakes and change direction.

Private firms have some external discipline to help them. If they keep resisting learning, they will eventually run out of customers and money, which limits the scale of their errors. It isn’t that they make fewer mistakes than governments; it is that discipline keeps the mistakes smaller. Nokia and Blackberry (or RIM, as the firm was once known) thought their technical superiority would let them retain market share in mobile phones. They were wrong. But their error didn’t stop the public buying phones; we just bought them from Samsung and Apple instead. Apple once thought they would change the world of handheld organisers with the Newton, but they had to stop making it before the losses bankrupted the whole firm.

Governments don’t have such external discipline. As a result their mistakes last longer and are bigger in scale. John Kay once described Britain’s attempt to build a new technology for nuclear power generation (the Advanced Gas-cooled Reactor, or AGR) as the worst public investment in the history of government. The promise was that the UK would have a world-beating new technology, invented here and under our control. It could be sold to others and would be a showcase of British technological expertise (unfortunately this came true: the programme was a showcase of how bad Britain’s government is at developing and exploiting new technology). The programme ended having spent perhaps £100bn on a technology that didn’t really work, dwarfing by a factor of 10 the joint French-British investment in the supersonic vanity project Concorde.

The AGR programme wasted so much because at no point did any advisor or any minister want to admit it was a failure. It is practically a perfect case study of the sunk-cost fallacy.

Sadly, governments are prone to making mistakes bigger than they should be because they can’t admit they are wrong. Examples abound: the National Programme for IT in the NHS; the Post Office basic bank account and benefits system; the Crown Prosecution Service’s case-tracking system; the Department for Communities and Local Government’s FiReControl project to reorganise emergency control rooms and systems. Many of these went wrong for multiple reasons (as documented in The Blunders of our Governments), but an inability to admit errors made them bigger and more damaging.

Policies that don’t have much of a sunk cost also suffer from this delusional assumption of omnicompetence. Tim Harford argues in his book Adapt that, in a complicated world, the only effective way to know what works is to experiment. But this isn’t easy in government. How many manifestos say: we will reform education by trying several different ways of teaching reading and adopt the one that gives the best results? Or: we will try several experiments to test which interventions are most effective at helping people out of poverty?

But even when experiments are sanctioned by government, the motivation of their advisors may undermine their value. The problem is that proper experiments inevitably generate failures. We try several ideas and some work much better than others (if we don’t test a variety, we cannot know which is best). So some of them will fail. Admitting this is an essential part of learning. But admitting failure is not in the DNA of most civil servants (at least in the UK). When they do experiments, they like to set them up so they can’t fail. Or they do their utmost to avoid admitting they have failed. Either approach utterly inhibits the ability to learn, and therefore the ability to improve.

Governments have to get better at this. The world is too complex to be dominated by ideology or by those who think they know the answer before they have tested whether it works.