How can we measure the value of economic statistics? This exploratory study tested ways of measuring the value of official economic statistics through qualitative and survey research with regular users. It identifies a number of challenges in obtaining quantifiable evidence of value from statistics users and recommends the use of case studies that provide measurable findings. A separate case study explored the economic impact of GDP data revisions. The study suggests that the value of economic statistics is very high. A future study could test the effectiveness of providing such information to the public.
In March 2015, the Conference of European Statisticians (CES) established a United Nations Economic Commission for Europe (UNECE) Taskforce to define the value of official statistics and develop ways to measure it. The Taskforce published its first report in 2017. One of its recommendations was for national statistical offices to explore ways of placing a monetary value on official statistics. In this context, ESCoE was commissioned by ONS to test these approaches. This was the first time that the UNECE recommendations had been piloted in the UK on official economic statistics. As such, this study was exploratory in nature, focused on identifying promising approaches and ways of mitigating challenges, as well as learning lessons for future studies.
The project had two parts. First, we carried out qualitative research with regular economic statistics users across different stakeholder groups, using a survey, interviews and focus groups. This was conducted across two waves, in 2017-18 and 2021-22, allowing us to explore any impact of the changes introduced by the ONS in the four years between the waves. Second, we carried out an empirical case study illustrating the macroeconomic cost of ‘early estimates’ of some key economic variables. We quantified the economic impact of early estimates of GDP using the National Institute of Economic and Social Research’s macroeconomic model, NiGEM.
Through our survey and focus groups, we showed that the value of official economic statistics is very high, most likely far exceeding the cost of producing them. Participants attached great importance to ONS economic statistics, saying they were invaluable to their work and to informing policy decisions. To place a monetary value on economic statistics, we tested a ‘stated preference’ approach, in which we asked participants how much they (or their organisation) would hypothetically be willing to pay for access to ONS economic statistics. Our study identified several challenges in applying this approach to economic statistics. One was that most regular users consume the statistics as employees of an organisation, and they have limited knowledge of their organisation’s budget and of how their work contributes to improving policy decisions or affects individual and organisational outcomes. Another strong theme was that participants were uneasy with hypothetical questions about willingness to pay. They felt the idea of paying for economic statistics was far-fetched, and repeatedly emphasised the value of ONS economic statistics as a public good.
The study recommended the use of case studies that provide measurable findings. A separate case study explored the economic impact of GDP data revisions by quantifying the benefits to the economy of more accurate real-time data. It identified two episodes when real GDP data were revised materially higher or lower, and showed that output would have been more stable if the Bank of England’s Monetary Policy Committee had had access in real time to the data as we see them today. For the period around the Global Financial Crisis, the study also shows that the policy rate would have been lowered earlier and more aggressively, raising output by £12 billion, or 0.8 per cent, in 2009. Employment would have been higher by more than 150,000 as a result. Overall, the study suggests that the value of economic statistics is very high.
In our broader discussions about how to value economic statistics, participants found it useful to assess their value by considering the counterfactual; that is, what would happen in the absence of ONS economic statistics? Most participants suspected that data supplied through an open market would be less reliable and less accountable than official economic statistics.
This project tested the viability of surveying users, alongside interviews and focus groups, to understand how those users value economic statistics. Our work has helped ONS deliver new insights on this topic, furthering international efforts to develop valid and efficient methods for eliciting this information from users, who may be uncomfortable with the topic, lack the necessary data, or fear that the exercise is really seeking evidence to set actual charges. The case study evidence we developed can be used as a tool for communicating the value of economic statistics to a wide range of audiences, and for engaging with students.
Building on this research, the UK is leading global efforts to develop robust methods for valuing all types of statistical data, not just economic statistics.