Do numbers matter in politics? The notorious “£350 million a week for the NHS” certainly played a vital role in the Brexit campaign. But so – according to Dominic Cummings, who directed the Vote Leave campaign – did 333 thousand. That was the record figure for net migration to the UK, published by the Office for National Statistics just a month before the vote.
Unlike the £350 million, the 333,000 is an official statistic, not a political construct. But that doesn’t mean it’s right. Indeed, there have long been questions about the reliability of the International Passenger Survey, the principal source of the data for migration statistics. And those doubts are growing. The latest statistics should have been published a fortnight ago. Instead, they’ve been postponed until July – according to ONS, there are “quality assurance issues.”
As part of the Economic Statistics Centre of Excellence (ESCoE) work programme, my colleagues Jonathan Wadsworth, Augustin de Coulon, Andrew Aitken, Michael O’Connor and I are undertaking a project examining the key migration statistics used in the UK, and in particular how they might be supplemented or improved with administrative data – something we know ONS is very keen to progress.
But, while we’ll have to wait for wholesale improvements to the migration statistics, we can still triangulate with what we have now. The ONS did publish estimates of the UK population, based on the Annual Population Survey, broken down by country of birth and nationality, for 2017. And that means we can compare what the population estimates tell us about who’s actually living in the country, with what the migration statistics tell us about people coming and going. And if we do this, we get some very odd results.
According to the population estimates, the number of people born elsewhere in the EU and now resident in the UK increased by nearly 1.5 million over the eight years to mid-2017. That suggests that net migration of EU citizens to the UK over that period should have been a similar number. However, the migration statistics show cumulative net EU migration of only about 950,000.
A very similar discrepancy – but in the opposite direction – applies to non-EU nationals. The IPS suggests total net migration of nearly 1.5 million – but the population estimates say that the number of people born outside the EU and resident in the UK has increased by less than 900,000. On the face of it, half a million people from outside the EU have vanished – but they’ve been replaced by about the same number of Europeans who we didn’t know we had.
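The arithmetic behind this triangulation can be sketched in a few lines of Python, using the rounded figures quoted above (these are illustrative approximations from the text, not the exact ONS series; all values in thousands, cumulative over the eight years to mid-2017):

```python
# Change in the UK-resident population by country of birth,
# from the APS-based population estimates (illustrative, thousands)
pop_change = {"EU_born": 1450, "non_EU_born": 900}

# Cumulative net migration by citizenship, from the IPS-based
# migration statistics (illustrative, thousands)
net_migration = {"EU": 950, "non_EU": 1450}

# If the two sources were broadly consistent, each gap would be
# close to zero; instead they are large and roughly offsetting.
gap_eu = pop_change["EU_born"] - net_migration["EU"]
gap_non_eu = pop_change["non_EU_born"] - net_migration["non_EU"]

print(f"EU gap: {gap_eu}k")        # ~500k more EU-born residents than the migration stats imply
print(f"non-EU gap: {gap_non_eu}k")  # ~550k fewer non-EU-born residents than the migration stats imply
```

The two gaps are of similar size and opposite sign, which is why the total migration figures from each source can roughly agree even though the EU/non-EU split is badly out.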
What should we make of this? You wouldn’t expect the two series to match precisely. People die, for a start, although not nearly enough to account for the shortfall in non-EU born in the population estimates. The IPS counts people by citizenship, not country of birth, so maybe some emigrants are foreign-born Brits. The population estimates exclude people who live in “communal establishments”, and therefore miss out some students who may have been counted in the immigration statistics, and so on. And there’s sampling error in both surveys. The ONS has produced a useful guide to the differences. But, while these factors can and do account for significant divergence between the two series in the short term, in my view they can’t possibly add up to such large cumulative differences over such a long period.
More likely, one or both of the surveys is badly wrong. The obvious candidate is the IPS. It has long been clear that it has been overestimating student net migration from outside the EU; it now seems likely that this has translated into an overestimate of non-EU migration overall. Meanwhile, it undercounted EU migration in the 2000s, and I argued two years ago that the volume of National Insurance numbers being issued to EU citizens suggested their numbers were still being undercounted; these data seem to bear that out. Almost certainly, some EU citizens who arrive here thinking they’re only going to stay for a short time – and hence don’t tell the survey that they’re immigrants – end up staying. Nor is the APS immune to error: the number of people prepared to take the time to answer a very long list of questions has been shrinking, and recent migrants may be even more reluctant.
So what does this tell us about immigration and immigration policy? We can’t conclude from this that the migration numbers overall are incorrect – the total migration figures from each source roughly correspond. And while it seems reasonable to conclude that the migration statistics are out to a significant degree, that doesn’t mean the population estimates are entirely accurate: their accuracy may have decreased over time. But it does seem reasonably clear that in the recent past EU migration has been significantly higher, and non-EU migration significantly lower, than we thought. Next month the ONS will – alongside the delayed stats – publish a more detailed analysis, so we may learn more. And I hope that over time the use of administrative data, which our ESCoE project will explore in parallel with ONS’ own work, will improve matters still further.
But this analysis, broad-brush as it is, may mean that the implications of Brexit for UK migration are even greater than previously believed. It suggests that, perhaps as a consequence of the government’s determination to reduce non-EU migration, as manifested in measures ranging from the Tier 2 visa cap to the “hostile environment”, we may have become even more dependent on EU migration. The recent fall in migration, already clear in the data, may have been underestimated. And the consequences for the UK economy, first of the reduced attractiveness of the UK as a result of the Brexit vote, and over the longer term of the ending of free movement, may be even bigger. All the more reason to press the reset button on migration policy.
Jonathan Portes is Professor of Economics and Public Policy, King’s College London and an ESCoE Researcher
Friday, June 22, 2018