Modelling and Communicating Data Uncertainty




Due to sampling and non-sampling errors, economic data, such as GDP, are uncertain. Despite this, economic data are frequently published without any direct quantitative indication of their uncertainty. The projects undertaken as part of this research programme considered the measurement of data uncertainty and its communication. Experimental surveys were used to provide guidance to statistical agencies on how the public and ‘experts’ interpret data, and how uncertainty information might best be communicated when publishing economic estimates. Our findings reveal that whether uncertainty information is communicated to the public, and how it is communicated, matters. For instance, we show that communicating uncertainty information alongside the GDP point estimate, or alongside estimates comparing productivity between the UK and other G7 countries, improves public understanding of data uncertainty without reducing trust in the statistical office. Overall, the results call for greater communication of data uncertainty.


These projects had two aims: to measure data uncertainty, and then to analyse the reactions of a range of decision makers to communication of this uncertainty, with uncertainty modelled, communicated and visualised in different ways. A key question was whether communicating data uncertainty might erode trust in official economic statistics.

Economic data uncertainty is rarely communicated quantitatively. An exception is the ‘fan chart’ for historical GDP growth published by the Bank of England. For our project on GDP uncertainty, we used these fan charts, together with econometric models of GDP data revisions, to measure, track and evaluate the evolving nature of GDP data uncertainty. We proposed an approach to extract a quantitative measure of unforecastable data uncertainty.
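The core intuition behind revisions-based uncertainty measurement can be sketched in a few lines. This is an illustrative toy, not the project's econometric model (which, among other things, allows the revision process to change over time): the spread of past revisions, from first-published to mature estimates, quantifies the uncertainty that should surround a newly published figure. All numbers below are invented.

```python
# Toy sketch: quantify GDP data uncertainty from the history of revisions.
# "initial" are first-published quarterly growth rates (%); "mature" are the
# later, heavily revised values for the same quarters. All figures invented.
initial = [0.3, 0.5, 0.2, 0.7, 0.4, 0.6]
mature = [0.4, 0.4, 0.5, 0.6, 0.3, 0.8]

# Revision = mature estimate minus initial estimate.
revisions = [m - i for i, m in zip(initial, mature)]
mean_rev = sum(revisions) / len(revisions)

# Sample variance and standard deviation of past revisions.
var = sum((r - mean_rev) ** 2 for r in revisions) / (len(revisions) - 1)
sd = var ** 0.5

# A rough 90% interval around a new first estimate of, say, 0.5%,
# centred on the bias-corrected value (normal approximation).
new_estimate = 0.5
interval = (new_estimate + mean_rev - 1.645 * sd,
            new_estimate + mean_rev + 1.645 * sd)
print(f"mean revision {mean_rev:+.2f}pp, sd {sd:.2f}pp, "
      f"90% interval ({interval[0]:.2f}, {interval[1]:.2f})")
```

Such an interval is exactly the kind of quantitative uncertainty information the experiments below present alongside a point estimate.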

To assess empirically whether and how the public and more experienced users interpret and understand GDP data uncertainty, we then conducted a randomised controlled online experiment to test the effectiveness of communicating uncertainty. We replicated this approach in subsequent studies, including an additional wave on GDP and a study of international comparisons of UK productivity.


We conducted online surveys as randomised controlled experiments. Implemented by Dynata, the surveys took a representative quota sample of the UK population across age, gender and region, with between 3,000 and 4,000 respondents per survey across the different projects and waves of research. Respondents were randomly allocated to different communication tools: a control group (presented with no uncertainty information) and a number of treatment groups (presented with uncertainty information). The surveys were used to assess perceptions of uncertainty, the public’s interpretation and understanding of uncertainty information communicated in different formats, and how communicating uncertainty affects trust in the data and in the producers of the data. Randomisation lets us identify the causal effects of different ways of communicating uncertainty information, so the results can help inform whether and how data uncertainty should best be communicated.
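The logic of this design can be sketched as follows. This is a hypothetical simulation, not the actual survey data: the arm names, the 3,000-respondent sample and the outcome probabilities are all invented for illustration. Because allocation is random, a simple difference in mean outcomes between a treatment arm and the control arm estimates the causal effect of that communication format.

```python
import random

# Hypothetical sketch of the experimental design: each respondent is randomly
# assigned to a control arm (no uncertainty information) or one of several
# treatment arms. Arm names and all outcomes are invented.
random.seed(0)
arms = ["control", "interval", "density_strip", "text"]
respondents = [{"id": i, "arm": random.choice(arms)} for i in range(3000)]

# Simulated outcome: 1 if the respondent correctly treats the point estimate
# as one value within a range. We assume (purely for illustration) that any
# uncertainty communication raises this probability from 0.4 to 0.6.
for r in respondents:
    base = 0.4 if r["arm"] == "control" else 0.6
    r["understood"] = 1 if random.random() < base else 0

def mean_outcome(arm):
    """Share of respondents in the given arm with outcome 1."""
    xs = [r["understood"] for r in respondents if r["arm"] == arm]
    return sum(xs) / len(xs)

# Randomisation makes the difference in means a causal treatment effect.
control_mean = mean_outcome("control")
for arm in arms[1:]:
    print(f"{arm}: estimated treatment effect "
          f"{mean_outcome(arm) - control_mean:+.3f}")
```

In the real experiments, the outcomes are survey responses about perceived uncertainty and trust rather than a simulated indicator, but the identification argument is the same.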


Our work on GDP provided quantitative estimates of unforecastable data uncertainty for UK GDP growth and found strong evidence of changes over time in the GDP growth data revision process. We also found that data uncertainty rises at the onset of recessions and is positively correlated with popular measures of macroeconomic uncertainty. Methodologically, we defined unforecastable uncertainty as a lack of knowledge about the future or the past and showed how our uncertainty measures can be used as the basis for tests of the calibration of probabilistic forecasts.
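A standard textbook device for checking the calibration of probabilistic forecasts is the probability integral transform (PIT): if forecast densities are well calibrated, evaluating each forecast's CDF at the realised outcome should yield values uniformly distributed on [0, 1]. The sketch below illustrates this generic idea on simulated data; it is not the specific test developed in the project.

```python
import math
import random

random.seed(1)

def normal_cdf(x, mu, sigma):
    """CDF of a normal distribution, via the error function."""
    return 0.5 * (1 + math.erf((x - mu) / (sigma * math.sqrt(2))))

# Simulated well-calibrated case: forecasts are N(0, 1) and the outcomes
# really are drawn from N(0, 1), so the PIT values should look uniform.
pits = [normal_cdf(random.gauss(0, 1), 0.0, 1.0) for _ in range(2000)]

# Crude uniformity check: the share of PITs in each decile should be
# close to 0.1. (Formal tests would use e.g. a chi-squared statistic.)
deciles = [0] * 10
for p in pits:
    deciles[min(int(p * 10), 9)] += 1
shares = [d / len(pits) for d in deciles]
max_dev = max(abs(s - 0.1) for s in shares)
print(f"max deviation of decile shares from 0.1: {max_dev:.3f}")
```

A miscalibrated forecaster (say, densities that are too narrow) would instead pile PIT values near 0 and 1, and the decile shares would deviate sharply from 0.1.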

We then ran a randomised controlled trial in two waves, with a combined sample of more than 7,000 adults representative of the UK population. We find that most of the UK public does not take initial GDP point estimates at face value: they attribute a degree of inaccuracy and uncertainty to the single-valued GDP numbers commonly communicated in headline data releases.

Our key finding is that communicating uncertainty information alongside the GDP point estimate improves the public’s understanding of data uncertainty but does not reduce its trust in the statistical office. It encourages more of the public to view the point estimate as just that: a point within a range of possible outcomes. We recommend that uncertainty information be communicated quantitatively using intervals, density strips and bell curves. These are, in general, preferable to textual descriptions, and certainly better than no communication at all beyond referring to GDP as an ‘estimate’. Our experiments suggest this will reduce the chance of the public misinterpreting the uncertainty information given to them.

Our work on international comparisons of UK productivity used a similar approach (an online survey experiment with 3,500 respondents) and reached similar conclusions. The survey results support the proposed communication tools as an effective way of conveying to the UK public the uncertainty in estimates of international productivity comparisons. The tools are effective even for respondents with limited knowledge of what productivity is, but they are likely to be most helpful to members of the public who are familiar with the concept, as these respondents are better at drawing inferences from the communicated data.


By showing that communicating data uncertainty can improve the public’s understanding of data without eroding trust, our research may encourage national statistical offices to do more to measure and communicate data uncertainty. We also hope this project will encourage further empirical research into uncertainty measurement and communication, such as the Comunikos project at Eurostat. Our research has generated widespread interest from central banks and from the ONS, including informing the ONS publication on international comparisons of UK productivity.


Galvão, A.B. and Mitchell, J. (2021) ‘Communicating Data Uncertainty: Experimental Evidence for U.K. GDP’, ESCoE Conference on Economic Measurement 2021 Poster Exhibition, 11-13 May 2021. Poster Presentation

Galvão, A.B. and Mitchell, J. (2021) ‘Communicating Data Uncertainty: Experimental Evidence for U.K. GDP’, RES Annual Conference 2021, Special Session ‘Improving the communication of economics and economic statistics’, 12-14 April 2021 (Presentation starts at 19:38)

Galvão, A.B., Mitchell, J. and Runge, J. (2020) ‘Communicating Data Uncertainty: Experimental Evidence for U.K. GDP’, ESCoE Conference on Economic Measurement 2020, Contributed Session G: GDP Measurement and Uncertainty, 16-18 Sep 2020 (Presentation starts at 02:00)

Galvão, A.B. and Mitchell, J. (2020) ‘Real-Time Perceptions of Historical GDP Data Uncertainty’, Warwick Business School Finance Group, Macroeconomic Policy and Forecasting, Working Paper Series No. 35, July 2020

Dey-Chowdhury, S., Galvão, A.B., Mahajan, S., Scruton, J. and Walton, A. (2020) ‘Communicating gross domestic product’, 27 April 2020, Office for National Statistics

Galvão, A.B., Mitchell, J. and Runge, J. (2019) ‘Communicating Data Uncertainty: Experimental Evidence for U.K. GDP’, ESCoE Discussion Paper Series ESCoE DP 2019-20, 5 December 2019

Galvão, A.B. (2019) ‘Communicating Data Uncertainty: Experimental Evidence for the UK GDP’, ESCoE Research Seminar, 15 October 2019

Galvão, A.B. and Mitchell, J. (2019) ‘Measuring Data Uncertainty: An Application using the Bank of England’s “Fan Charts” for Historical GDP Growth’, ESCoE Discussion Paper Series ESCoE DP 2019-08, 24 May 2019

van der Bles, A.M., van der Linden, S., Freeman, A.L.J., Mitchell, J., Galvão, A.B., Zaval, L. and Spiegelhalter, D. (2019) ‘Communicating uncertainty about facts, numbers and science’, Royal Society Open Science, Vol. 6, Issue 5, The Royal Society Publishing.

Galvão, A.B. (2018) ‘Data Uncertainty and Business Cycles: an Evaluation of the Bank of England’s Probabilistic Backcasts’, ESCoE Research Seminar, 13 February 2018.


