Learning the Lessons from the Atkinson Review


By Richard Heys


When I joined the Civil Service, fresh out of university, the Government of the day, as with all governments, was focused intently on how effectively the public services were being delivered. Economists across Whitehall were tasked with thinking deep thoughts on questions of efficiency and effectiveness, aka productivity.

The problem was that the only overarching framework for measuring success or otherwise was the one contained in the National Accounts. In fact, up to the 1990s, that framework had been one of output equals cost of inputs. Public service output was deemed to be whatever you spent on it. That was clearly unhelpful from a productivity perspective as it meant productivity was by definition always unchanged – which did not accord with the real world. So, the international national accounting fraternity moved away from this system and encouraged statistical offices to find direct ways to measure items such as health service or schools output.

ONS enthusiastically embraced this endeavour but its implementation of the agenda was problematic. It led to implausible results which threatened the credibility not only of the public service productivity measures but of the UK National Accounts themselves. This led the then National Statistician, Len Cook, to ask Tony Atkinson, one of the UK’s leading academic economists, to undertake a review. Supported by a strong team drawn from the Government Economic and Statistical Services and the Bank of England, led by Joe Grice, Tony’s 2005 review became the seminal work in the field, not just in the UK but internationally. It is a testament to the quality and depth of the thinking undertaken that the Office for National Statistics is still, in 2019, fielding regular visits from statisticians of other countries, and being invited to speak at prestigious international events, to discuss this approach to the subject of measuring public service output.

Tony’s report had two key recommendations. One was to argue for measurement to be underpinned by a set of principles. The previous difficulties had arisen from opportunistic choosing of likely-looking indicators, without any underlying principled framework, which then turned out for various reasons to be problematic. His second crucial point was that public services output should be measured in a way consistent with that used for private sector output. Specifically, the aim should be to measure the value added by the public services.

This last point had an important implication. Public service output has to reflect not only quantity – how many patients are treated or how many pupils are taught – but also the quality dimension: how much value is added in treating those patients or teaching those pupils. Tony was clear that measuring public services output had to have regard to how the outcome had been improved as a direct result of that public service. Of course, this is not altogether straightforward to achieve, but it is a subject on which health and education specialists and practitioners spend a great deal of time. So, there is plenty of material to assist informed measurement.

The ONS eagerly took up this challenge and over the last 14 years has worked to deliver statistics on public service productivity which take into account Tony’s insight, recognising that the European System of Accounts, which is legally binding on the UK while it is a member of the EU, prevents the ONS from applying these adjustments in the National Accounts. In a recent paper (ESCoE DP 2019-11), written with Fred Foxton, Joe Grice and James Lewis, I review the achievements which statisticians in the metaphorical trenches have delivered in this period, and the challenges they have had to overcome.

This journey hasn’t been simple. Early on, the two largest services, health and education, were quality adjusted, but today still only around half of the public services are ‘quality adjusted’, and because of ever-changing institutional and policy designs these adjustments need to be constantly reviewed and refreshed. (This would have been no surprise to Tony Atkinson. He specifically recommended that periodic reviews would be needed, for just these reasons.) Whilst in the last two years ONS has been able to create new quality adjustments for the criminal justice system and adult social care sectors, key gaps still remain, for example in children’s social care, policing, fire, defence and public administration. Similarly, maintaining data access, as data collections evolve and change, has presented challenges. Nevertheless, the simple fact is that quality adjustment has fundamentally changed our understanding of the public services. Drawing from the latest ONS publication (figure 11, Public service productivity: total, UK, 2016), we can see that, without quality adjustment, we would believe that productivity has fallen since 1997, whereas with quality adjustment productivity has grown.

Equally, if methods could be developed to quality adjust the 50% of services which are not currently quality adjusted, and assuming this group grew at the same pace as the average of those which are, measured growth would double from 0.2% p.a. over the last twenty years to 0.4% p.a.
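The arithmetic behind this extrapolation can be sketched as a simple weighted average. This is an illustrative back-of-envelope calculation, not official ONS methodology: it assumes the quality-adjusted and unadjusted halves of public services carry equal weights, and that the unadjusted half currently shows roughly zero measured productivity growth, so that a published aggregate of 0.2% p.a. implies the adjusted half grows at about 0.4% p.a.

```python
def aggregate_growth(adjusted_growth, unadjusted_growth, adjusted_share=0.5):
    """Weighted average of productivity growth rates (% p.a.).

    Hypothetical sketch: equal expenditure shares are assumed for the
    quality-adjusted and non-quality-adjusted halves of public services.
    """
    return adjusted_share * adjusted_growth + (1 - adjusted_share) * unadjusted_growth

ADJUSTED_HALF_GROWTH = 0.4  # implied by a 0.2% aggregate if the other half is flat

# Current position: half of services quality adjusted, the rest measured flat.
current = aggregate_growth(ADJUSTED_HALF_GROWTH, 0.0)

# Counterfactual: the unadjusted half, once quality adjusted, matches that pace.
extended = aggregate_growth(ADJUSTED_HALF_GROWTH, ADJUSTED_HALF_GROWTH)

print(current, extended)  # 0.2 0.4 — measured growth doubles
```

Under these (assumed) equal weights, extending quality adjustment to the remaining services doubles measured aggregate growth, matching the 0.2% to 0.4% p.a. movement described in the text.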

The paper asks two big questions: what lessons have been learnt from attempting to apply Atkinson, and what should we look to do in the future?

Probably the key lesson learnt is that whilst conceptually strong, the devil has often been in the detail, where we have identified the following material challenges:

  • How should various aspects of quality be objectively valued and weighted against one another?
  • How should different quality-adjusted services be weighted together?
  • How do we keep pace with the rate of technological change?
  • Should we follow individuals or use aggregate data?
  • What do we do when a change in policy affects the measure?
  • How do we trade off consistency of estimates with different needs for data in relation to devolved matters?

The paper tries to summarise how ONS has tackled each of these, but looking to the future a further question emerges. Everyone knows that GDP is not a good measure of welfare, but with the public sector accounting for around 20% of the economy, the changes in the quality of outcomes reflected here are substantial, and the argument for including them within the National Accounts needs to be refreshed and reinforced.

However, the public services are not the only way society drives some of the outcomes under consideration. Atkinson was clear that the public services should only be adjusted for the element of any quality change which could be directly attributed to the public sector’s contribution. Lifestyle decisions, like whether to smoke or go to the gym, can have as important an impact on healthy life expectancy as the contribution from health services. As we increasingly strive to understand the role of these social factors, which sit outside GDP, the paper suggests that measurement of their impact needs to build on the lessons learnt in Atkinson’s review.

ESCoE blogs are published to further debate.  Any views expressed are solely those of the author(s) and so cannot be taken to represent those of the ESCoE, its partner institutions or the Office for National Statistics.

About the author

Richard Heys
