Due to sampling and non-sampling errors, economic data, like GDP, are uncertain. Despite this, economic data are frequently published without any direct quantitative indication of their uncertainty. This project considered the measurement of data uncertainty and its communication. Experimental surveys were used to provide guidance to statistical agencies on how the public and ‘experts’ interpret data, and how uncertainty information might best be communicated when publishing national account estimates. Our findings reveal that whether uncertainty information is communicated to the public, and how it is communicated, matters. Of particular significance for statistical agencies, we show that communicating uncertainty information alongside the GDP point estimate affects, and improves, public understanding of why data revisions happen.
This project had two aims: to measure data uncertainty, and then to analyse the reactions of a range of decision makers to communication of this uncertainty, with uncertainty modelled, communicated and visualised in different ways. A key question was whether communicating data uncertainty might erode trust in official economic statistics.
Economic data uncertainty is rarely communicated quantitatively. An exception is the ‘fan chart’ for historical GDP growth published by the Bank of England. We used these fan charts, and estimated econometric models of GDP data revisions, to measure, track and evaluate the evolving nature of GDP data uncertainty. We proposed an approach to extract a quantitative measure of unforecastable data uncertainty.
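The basic idea of measuring data uncertainty from the revision record can be sketched as follows. This is an illustrative simplification, not the project's econometric model: it assumes, purely for illustration, that revisions to first-release growth figures are roughly normal, and all numbers are invented.

```python
import numpy as np

# Invented quarterly GDP growth figures (%): first releases and the
# corresponding "mature" estimates after later revisions.
first_estimates = np.array([0.3, 0.5, -0.1, 0.4, 0.6, 0.2, 0.5, 0.1])
mature_estimates = np.array([0.4, 0.4, 0.1, 0.6, 0.5, 0.4, 0.6, 0.0])

revisions = mature_estimates - first_estimates
bias = revisions.mean()          # systematic, forecastable component
sigma = revisions.std(ddof=1)    # dispersion: a proxy for data uncertainty

# An approximate 90% interval for the mature figure implied by a new
# first estimate, under the normality assumption above.
z90 = 1.645
latest_first_estimate = 0.3
lo = latest_first_estimate + bias - z90 * sigma
hi = latest_first_estimate + bias + z90 * sigma
print(f"GDP growth: {latest_first_estimate:.1f}% "
      f"(90% interval: {lo:.1f}% to {hi:.1f}%)")
```

The interval widens mechanically with the historical dispersion of revisions, which is why uncertainty bands around early estimates tend to be wider than those around mature data.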
To assess empirically whether and how the public and more experienced users interpret and understand GDP data uncertainty, we conducted a randomised controlled online experiment with more than 3,000 nationally representative members of the public and a targeted survey of more than 100 experts. To maximise realism, both surveys asked questions about ONS’s latest GDP estimates and headline press release. The specially designed surveys were used to assess perceptions of the uncertainty in single-valued GDP numbers, the public’s interpretation and understanding of uncertainty information communicated in different formats, and how communicating uncertainty affects trust in the data and the producers of the data. By randomly allocating members of the public to one of six groups – a control group presented with no uncertainty information, and five treatment groups presented with different uncertainty information – we identified the causal effects of different ways of communicating uncertainty information. The results can be used to help inform whether and how data uncertainty information should best be communicated.
In ongoing research we are running an additional wave of our online public survey during the economic recession induced by the coronavirus shutdown. This will enable us to identify whether perceptions of data uncertainty, and the effectiveness of different communication treatments, vary with the state of the economy and the public’s perceptions of it.
In complementary work we provided an interdisciplinary review, combining statistical and psychological perspectives, that structured and summarised current practice and research in uncertainty communication.
We provided quantitative estimates of unforecastable data uncertainty for UK GDP growth and found strong evidence of changes over time in the GDP growth data revision process. We also found that data uncertainty rises at the onset of recessions and is positively correlated with popular measures of macroeconomic uncertainty. Methodologically, we defined unforecastable uncertainty as a lack of knowledge about the future or the past and showed how our uncertainty measures can serve as the basis for calibration tests of probabilistic forecasts. Our experimental results indicated that the majority of the public, like experts, do not take GDP point estimates at ‘face value’. The majority, whether asked about this qualitatively or quantitatively, expect data uncertainty and are not surprised when data are revised.
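The logic behind a calibration test can be illustrated with a simple coverage check. This is a hypothetical sketch, not the project's test statistics: a well-calibrated 90% uncertainty band should contain the mature estimate roughly 90% of the time, and the simulated data below are invented.

```python
import numpy as np

# Simulate first-release growth figures and the "mature" figures they
# are later revised to (all numbers invented for illustration).
rng = np.random.default_rng(0)
n = 1000
point = rng.normal(0.4, 0.5, n)           # first-release estimates
truth = point + rng.normal(0.0, 0.15, n)  # mature estimates after revision

# A 90% band built from the assumed revision standard deviation should
# cover the mature figure about 90% of the time if well calibrated.
sigma = 0.15
z90 = 1.645
covered = (truth >= point - z90 * sigma) & (truth <= point + z90 * sigma)
coverage = covered.mean()
print(f"Empirical coverage of nominal 90% band: {coverage:.1%}")
```

Empirical coverage far below the nominal 90% would indicate that published uncertainty bands understate data uncertainty; coverage far above it would indicate they overstate it.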
Importantly, we found that whether uncertainty information is communicated to the public, and how it is communicated, matters. Communicating uncertainty information alongside the GDP point estimate affects, and improves, public understanding of why data revisions happen. It encourages more of the public, rightly, to view the point estimate as just that: a point within a range of possible outcomes. Our randomised experiments also revealed that these quantitative communications of GDP data uncertainty need not reduce trust in statistical agencies. We recommend that uncertainty information be communicated quantitatively using intervals, density strips and bell curves. These are, in general, preferable to textual descriptions and certainly better than no communication at all beyond referring to GDP as an ‘estimate’. Our experiments suggest this will reduce the chance of the public misinterpreting uncertainty information given to them.
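Of the recommended formats, the density strip is the least familiar. Its construction can be sketched as follows, with an invented point estimate and standard deviation: ink is shaded in proportion to the implied density, so the strip is darkest at the point estimate and fades towards less likely values.

```python
import numpy as np

# Invented numbers: a GDP growth point estimate (%) and a revision-based
# standard deviation.
point, sigma = 0.3, 0.2

# Evaluate a normal density over a grid spanning +/- 3 standard
# deviations, then rescale so 1 means "darkest ink".
grid = np.linspace(point - 3 * sigma, point + 3 * sigma, 200)
density = np.exp(-0.5 * ((grid - point) / sigma) ** 2)
shade = density / density.max()

# `grid` and `shade` could then be drawn, e.g. with matplotlib's
# pcolormesh, as a single horizontal strip whose darkness follows `shade`.
```

Unlike an interval, the strip has no hard edges, which avoids suggesting a sharp cut-off between "possible" and "impossible" values.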
By showing how communicating data uncertainties can improve the public’s understanding of data without eroding trust, our research might encourage national statistical offices to do more to measure and communicate data uncertainties. It is also hoped that the work of this project will encourage more empirical research into uncertainty measurement and communication, such as the Comunikos project at Eurostat. Our research has generated widespread interest from central banks.
External project papers
van der Bles, A.M., Van der Linden, S., Freeman, A.L.J., Mitchell, J., Galvao, A.B., Zaval, L. and Spiegelhalter, D. (2019) “Communicating uncertainty about facts, numbers and science”, Royal Society Open Science, Vol 6, Issue 5, The Royal Society Publishing. https://doi.org/10.1098/rsos.181870
Dey-Chowdhury, S., Galvão, A.B., Mahajan, S., Scruton, J. and Walton, A. (2020) “Communicating gross domestic product”, 27 April 2020, Office for National Statistics.