Pardon the Error

There can be few more pervasive and life-changing decisions in recent years than the choice of fiscal austerity. Hardly any government has been allowed to continue borrowing and spending: millions have been laid off, countless contractors have gone to the wall, and we have all endured the chill of recession.

Even now, public debt and further cuts to public spending are a major issue in the UK’s General Election campaign prior to voting on 7 May 2015, and the future of the Greek economy is a limiting factor for the economy of the whole of Europe.

Although not the sole evidence driving these swingeing cuts, one of the most influential pieces of research that has kept politicians to their painful course is that of Reinhart and Rogoff (2010).

Analysing national economic data, they concluded that countries with public debt over 90% of their GDP showed significantly lower growth rates. The last time the UK was in that danger zone was 1965, and from 1987 onwards, with the tough policies of Thatcherism, we had kept close to the idyllic zone of 30% associated with highest growth outcomes.

However from 2008 our debt climbed alarmingly, reaching 60% of GDP in 2011: without a stiff dose of austerity we could have rapidly passed into danger.

In April 2013, Herndon, Ash and Pollin re-analysed Reinhart and Rogoff’s study, identifying a basic error in the Excel spreadsheet used to manipulate the data, unexplained data omissions, and controversial weighting. They concluded that any fall-off in growth rate does not occur until debt reaches 120% of GDP, the crazy realms of the Greek economy, and hardly likely in the UK without sustained government incompetence.
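The mechanics of such a slip are mundane. The sketch below uses entirely hypothetical growth figures (not Reinhart and Rogoff’s actual data, and not their actual formula) to show how an averaging formula dragged over the wrong range of rows silently drops countries and shifts the result:

```python
# Illustrative sketch only: hypothetical growth rates, invented for this example.
growth_rates = {
    "Australia": 3.2, "Austria": 2.1, "Belgium": 2.6, "Canada": 2.2,
    "Denmark": 1.8, "France": 2.3, "Germany": 1.9, "Italy": 1.4,
}

# The mean over the full sample:
full_mean = sum(growth_rates.values()) / len(growth_rates)

# An AVERAGE() formula covering the wrong rows quietly excludes countries,
# the kind of error Herndon, Ash and Pollin identified:
truncated = {k: v for k, v in growth_rates.items()
             if k not in ("Australia", "Austria", "Belgium", "Canada", "Denmark")}
truncated_mean = sum(truncated.values()) / len(truncated)

print(f"full sample:      {full_mean:.2f}%")   # 2.19%
print(f"truncated sample: {truncated_mean:.2f}%")  # 1.87%
```

Nothing in the spreadsheet flags the omission; only re-deriving the figure from the raw data exposes it.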

Reinhart and Rogoff have accepted their Excel blunder, but stuck to their other guns, and no doubt the many governments and international institutions that have relied on their papers will prove equally unrepentant.

It is this sort of issue that worries me about Simon Rogers, founder of the Guardian Datablog and now Data Editor at Google, and ‘data journalism’. His book Facts are Sacred holds that fascination inherent in numeric compilations like Wisden, but appears riven by the naïvety found in those who have worked little with real experimental or observational data.

To equate measured or reported data, or implicit relationships within data, with facts reveals a dangerous overconfidence. In 40 years of measuring simple and abstruse variables, I have learned humility in my own data, and suspicion in those of others.

First you must know and allow for the error in measurement itself; in the case of economic indicators, drilling down to discover how the figures are generated makes you realise that many are little more than educated guesses, often compiled by governments, organisations, and companies with vested interests in what the data purportedly show.

Furthermore you need to know not just the single best estimator (usually a mean or median) of each data point, but the spread about each point resulting from error and variance. Simon Rogers’ data ‘facts’ seem worryingly free of spread or confidence intervals, but enticingly crisp.
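What that spread looks like is easily computed. A minimal sketch, using invented sample values, of how a point estimate gains a rough 95% confidence interval (normal approximation):

```python
import statistics

# Hypothetical repeated measurements of one indicator; the point estimate
# alone conceals how uncertain it is.
samples = [1.9, 2.4, 1.1, 3.0, 2.2, 1.6, 2.8, 2.0]

mean = statistics.mean(samples)
sd = statistics.stdev(samples)        # sample standard deviation
sem = sd / len(samples) ** 0.5        # standard error of the mean

# Rough 95% confidence interval: mean +/- 1.96 standard errors
low, high = mean - 1.96 * sem, mean + 1.96 * sem
print(f"mean {mean:.2f}, 95% CI roughly ({low:.2f}, {high:.2f})")
```

A ‘fact’ quoted as 2.1 reads very differently once it is quoted as 2.1 ± 0.4.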

Even when armed with reliable data, it is all too simple to draw misleading conclusions. When I was first learning serious statistics, I observed an exceptionally strong relationship between the number of colour TVs in the UK and the suicide rate. It would have been too easy to assume that relationship was causal, in the same way that others have assumed that very high debt ratios cause low growth rates.
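The trap is easy to reproduce. Two invented series that both merely trend upward over time (the figures below are hypothetical, not the actual UK data) produce a Pearson correlation close to 1 with no causal link at all:

```python
# Hypothetical, made-up series: both simply rise over the same period.
tvs      = [1, 3, 6, 10, 14, 17, 19, 20]                  # millions of colour sets
suicides = [8.0, 8.4, 9.1, 9.9, 10.6, 11.0, 11.3, 11.5]   # rate per 100,000

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

r = pearson(tvs, suicides)
print(f"r = {r:.3f}")   # close to 1, yet no causation whatever
```

Any two variables that drift in the same direction over the same period will correlate this strongly; the correlation says nothing about cause.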

Aside from this now infamous Excel error, looking at the scatter of data in Herndon, Ash & Pollin’s analysis there are clearly many economies which have enjoyed high growth rates at times that they have also had very high debt ratios.

In reality, any association seems too weak to bank your economy on. And are we confident that economists who fail to spot their computational cock-ups are a steady hand guiding the tiller of international fiscal policy?

Updated from the original, which was first published in MacUser volume 29 issue 07, 2013. It really had a big impact on government policy, then…