By Ana Galvão and James Mitchell
The ONS recently told us that GDP in the UK contracted by 9.8% in 2020. But economic data like this GDP number are uncertain.
As we previously discussed in our 2019 ESCoE Discussion Paper ‘Communicating Data Uncertainty: Experimental Evidence for U.K. GDP’, GDP estimates are subject to sampling and non-sampling errors. One manifestation of such data uncertainty is data revisions, namely the tendency for history to be rewritten as new information arrives and (understandably) changes our view of past economic performance. There is an inherent trade-off between timeliness and accuracy. Revisions to early GDP estimates should be expected.
Despite the awareness of data uncertainty in statistical offices, GDP data continue to be commonly presented as point estimates. Headline press releases, across statistical offices internationally, focus on single-valued GDP numbers, arguably conveying a misleading degree of accuracy in the underlying data.
To assess whether and how the UK public interpret and understand GDP data uncertainty during the pandemic, our latest study extends our previous work with a second, expanded wave of our online randomised controlled trial. UK GDP saw its worst-ever contraction in the second quarter of 2020. To understand the importance of communicating, or not communicating, data uncertainty, we randomly sampled more than 4,000 adults. GDP values are communicated to a control group in a format that mimics recent Office for National Statistics (ONS) press releases: they are “simply” told that GDP declined by 21.7% in 2020Q2.
Treatment groups are then randomly presented with alternative qualitative and quantitative communications of GDP data uncertainty. The qualitative “communication tools” involve emphasising verbally that the GDP number is an estimate and likely to be revised. The quantitative communication tools include intervals, density strips and fan charts – as shown below. We also consider confidence intervals around the historical time-series of first estimates of GDP, as recently proposed by ONS. This visualisation of both current and historical data uncertainty is taken directly from the ONS themselves.
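To illustrate how an uncertainty interval of this kind can be built, here is a minimal sketch in Python. The revision history, percentile choice, and interval width are entirely illustrative assumptions for exposition; they are not the ONS's actual data or methodology.

```python
import numpy as np

# Hypothetical revisions to first estimates of quarterly GDP growth
# (later estimate minus first estimate, in percentage points).
# Illustrative numbers only, not real ONS revision data.
revisions = np.array([0.2, -0.1, 0.3, 0.0, -0.2, 0.4, 0.1, -0.3, 0.2, 0.1])

first_estimate = -21.7  # first estimate of 2020Q2 GDP growth (%)

# Empirical 90% interval: shift the point estimate by the 5th and 95th
# percentiles of past revisions.
lo, hi = first_estimate + np.percentile(revisions, [5, 95])
print(f"GDP growth: {first_estimate}% (interval: {lo:.1f}% to {hi:.1f}%)")
```

The same percentile band, computed period by period, is what a fan chart or density strip would shade around the headline series.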
Our updated experimental results confirm our earlier finding: most of the UK public understand that there is uncertainty inherent in official GDP numbers.
Communicating uncertainty information improves understanding. It encourages the public not to take estimates at face value, but does not decrease trust in the data. Quantitative tools for communicating data uncertainty – notably intervals, density strips and bell curves – are especially beneficial. In particular, we find that they anchor the public’s sometimes wild subjective probabilistic expectations of data uncertainty, improving their alignment with objective estimates derived from modelling revisions to GDP growth estimates. This should facilitate improved decision making, at least to the degree that the public’s expectations of data uncertainty better anticipate future data revisions.
We do find that the effects of the quantitative communication tools are stronger for those members of the public who understand what GDP is and who trust the ONS. This suggests scope to redesign statistical and data communication strategies to maximise their impact, and an opportunity to tailor those strategies to specific audiences.
The ONS already emphasise the uncertainty of early GDP data releases both by indicating that their data will be revised and by making real-time datasets available. But our results suggest that there will be positive pay-offs for the ONS, and other communicators of data, by expanding their efforts to provide direct and explicit visual communications of data uncertainty for variables like GDP, contrary to international practice in headline press releases. These efforts might build on ONS’s exciting work to communicate GDP data uncertainty via confidence intervals.
Our results for economic statistics are therefore consistent with emerging inter-disciplinary evidence that providing quantitative uncertainty information – “margins of error” – leads to a better understanding of the range of possible outcomes and better decisions by non-experts. But, importantly, it need not erode trust in the data themselves.
A full version of this Discussion Paper is available here.
The views expressed herein are those of the authors and not necessarily those of the Federal Reserve Bank of Cleveland or the Federal Reserve System.
ESCoE blogs are published to further debate. Any views expressed are solely those of the author(s) and so cannot be taken to represent those of the ESCoE, its partner institutions or the Office for National Statistics.