Non-European Student Migration to the UK

15 April 2015
Next update: 30 June 2016
Press contact: Rob McNeil

This briefing answers key questions about how many students from outside Europe come to the UK, where they come from, their characteristics, who sponsors them, and how many eventually settle in the country.

Key points

  • The number of international students coming to the UK has fallen since 2010.
  • Seven out of ten international students came from non-EU countries in 2013. Students from Asia make up the largest group of international students, over five times as many as the next largest group from the Middle East.
  • Eight out of ten sponsorships for international students in 2013 were made by UK higher education institutions.
  • Data sources on the extent to which students remain in the UK after their studies point in different directions.
  • Students bring fewer dependents to the UK than migrants in other categories, such as labour migrants.

Understanding the evidence

Data on international students come from several different sources that use different definitions and that count students in different ways. For the purpose of estimating the contribution of students to overall counts of Long-Term International Migration (LTIM), the Office for National Statistics (ONS) makes estimates based on entrants’ expressed intent to stay for at least one year, as measured through the International Passenger Survey (IPS). By this measure, entrants to the UK who say they are here for formal study and who say they plan to stay in the UK for at least 12 months are counted as migrants by the ONS.
Alternative sources of data on student migration come from administrative data on 'entry clearance visas' issued to students, and 'passenger entries' into the UK of international students. Passenger entry data are based on a projection from a sample of landing cards filled out at ports of entry; visa data are actual counts of all visas issued. These data sources do not exclude international students expected to stay for less than one year, who are therefore not classified as migrants in the IPS/LTIM figures. They do, however, separately track 'student visitor' visas of eleven months or less; these are excluded from the analyses below. Administrative data do not cover European and British nationals, since visas and landing cards are required only of people subject to immigration control. (Those subject to immigration control are nationals of countries outside the EEA [European Economic Area], with the exception of Switzerland; this group is often shortened to 'non-EEA/Swiss nationals'.)
International students attend a wide range of institutions, including universities, colleges, vocational schools, English language schools, and independent fee-paying secondary schools. They range from secondary school-age children to postgraduate student researchers.
Note that all Home Office statistics in this briefing that are greater than 1,000 are rounded to the nearest 100. Figures in tables and charts are not rounded.


International student arrivals falling across various measures

Data from each of the available sources are consistent in finding that student migration comprises a significant share of international migration to the UK, but has declined since 2010. The sources diverge as to the actual number of students arriving.

In 2013, for example, administrative data on passenger entries and visas recorded different numbers of international students from outside Europe (199,600 visas issued, 184,500 passenger entries), while the IPS estimated 122,000 non-EU nationals coming for the purpose of formal study.

Adding British and other EU nationals (plus adjustments for changes to length of stay) yields an LTIM estimate of 171,000 international students in 2013. These differences might arise because administrative data count students who stay for less than one year, but it is impossible to know for sure from the presently available data. (For more on discrepancies between sources, see Evidence gaps and limitations below.)
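As a rough arithmetic illustration of how these headline 2013 figures fit together (the precise split between the British/EU component and the length-of-stay adjustments is not published in this form, so the second term is inferred):

$$171{,}000 \;\approx\; \underbrace{122{,}000}_{\text{non-EU students (IPS)}} \;+\; \underbrace{\sim 49{,}000}_{\text{British/EU students plus LTIM adjustments}}$$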

As Figure 1 shows, the trends over time also differ by data source. IPS estimates show student migration increasing from the early 1990s to 2010 and then decreasing until 2013. Reliable visa data are available only from 2007, and show an increase from 2007 to 2009, decreases during 2010-2012, and a small increase in 2013. Trends in passenger entry data are not clearly interpretable for two reasons. First, student visitors were included in this series until mid-2007; second, in mid-2003, the Home Office improved its methods of estimating these data from samples of landing cards, making pre-2004 data less comparable. It appears, then, that the earlier data overestimated student entries. This means that there probably was an increase in passenger entries between 2000 and 2010 that is masked by these changes in data methods and presentation.

Figure 1


Seven out of ten international students came from non-EU countries in 2013, and 55% were female

According to IPS estimates, international students come primarily from non-EU countries. In 2013, about 70% of students came from non-EU countries (122,000 out of 171,000). Figure 2 shows how this proportion has changed over time.

Figure 2

Meanwhile, passenger entry data from the Home Office reveal more detail about where these students and their dependents come from. Figure 3 shows that in 2013 the greatest number of students and dependents came from Asia (about 120,000), followed by the Middle East (21,000). Significant recent changes include a sharp drop from North America (over 70,000 each year from 2005-2007, down to 19,400 in 2013) and a nearly 50% one-year increase in Asian students and dependents from 2008 (124,300) to 2009 (183,300), followed by a drop to 120,000 in 2013.
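As a quick check of the "nearly 50%" one-year increase quoted above, using the two counts given:

$$\frac{183{,}300 - 124{,}300}{124{,}300} \;\approx\; 0.47 \quad \text{(about 47\%)}$$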

Figure 3

By gender, in 2013 the majority (55%) of student migrants of all nationalities coming to Britain were female. Figure 4 shows that these more recent estimates represent a substantial change from the 1999-2011 period, when males made up the majority of student migrants.

Figure 4

More than 800 sponsor institutions lost their licences between 2010 and 2014

From 2010 onwards, the UK government introduced a series of policies designed to address ‘abuse’ of the student visa route, particularly cases in which non-EEA migrants enter the UK on a formal study visa but in fact come to work. As part of these changes, all education sponsors were required to apply for ‘highly trusted’ status (HTS). This required them to meet a number of criteria, including having a low rate of student visa refusals and a high rate of course completion.

Between 1 May 2010 and 7 October 2014, 836 education providers lost their licences, preventing them from bringing non-EEA students to the UK. Interpreting this number is not entirely straightforward. First, some licences were revoked because institutions did not apply for highly trusted status. This may be because institutions knew they did not meet the new criteria, or for other reasons—for example, because they stopped operating or went bankrupt. Second, some providers on the list of organizations that had their licence revoked during this period reapplied and had their licence reinstated. Third, not all the colleges that lost their licences will have closed: they can no longer sponsor non-EEA students but are not prevented from operating for domestic or EEA students.

Specific data on each of these categories of revocation are not available, but management data provided by the Home Office suggest that:

  • 223 licences were revoked because the sponsor did not apply for highly trusted status by October 2011 (this does not include colleges that had not yet been licensed for at least 12 months at that point, and which therefore faced a later deadline).
  • A further 237 colleges either failed to meet a later deadline to apply for HTS, or applied and were refused. 

As of March 2015, 70 colleges whose licences had previously been revoked were licensed again under the same name (this does not include any that may have reapplied under a new name).

Data on student visa applications include information on the type of educational institution acting as sponsor. Figure 5 shows the number of sponsored visa applications by type of institution. In 2014, the majority of international students (81%, or 168,600) were sponsored by UK-based higher education institutions. The remainder were sponsored by tertiary, further education or other colleges (9%), English language schools (2%), independent schools (7%) and other institutions (1%). It is important to note that not all students who received visa sponsorship took up the offer and came to the UK.

The decrease in visa sponsorships from 2010 to 2014 was driven by lower numbers sponsored by tertiary or further education colleges. Sponsorships at UK-based higher education institutions continued to rise during this period.

Figure 5


Mixed data on student migrants adding to the permanent population

Student visas are temporary, which means they do not provide a direct legal route to settlement. Student migrants are thought to have shorter stays in the UK than other types of migrants. According to Home Office data, of those who entered in 2008 on a student visa (or as a student’s dependent), only 15% still had valid leave to remain in the UK five years later in 2013, and only 1% had received indefinite leave to remain (Home Office 2015a). This is consistent with results for previous cohorts (see Achato et al. 2010).

However, the first two consecutive years of data collected from a new IPS question suggest that students may actually depart at a lower rate than Home Office research indicates. In 2012, the IPS began to ask departing emigrants why they had come to Britain in the first place (for those who were former immigrants rather than people who had always resided in Britain).

The first set of results from this question, for 2012, showed that only 67,000 people who had originally immigrated to Britain to study left the country. In 2013, this figure increased to 72,000. The estimates suggest a lower rate of exit for those who entered Britain for formal study than indicated by the Home Office research, which tracks legal leave to remain (and its expiration) rather than actual departures from Britain. The ONS notes that the limited time frame for this question means that there is not yet a clear picture of how student entries and exits interact over time. Moreover, it is important to be clear that the Home Office research noted above derives from data on visas and Home Office internal case management files, so it is not directly comparable to ONS data from the IPS. Nonetheless, the new data paint a different picture of student emigration than the visa data imply.

Those wishing to stay in the UK beyond the terms of their initial student visa have several options: they can apply for an extension to their student visa, or apply for a new visa under a work or family category. Some might stay on without a visa, but overstayers are difficult to estimate and impossible to count directly using administrative data, as they evade the immigration control system. The outcomes of extension applications made by non-EEA students appear in Figure 6. Refusals have been relatively rare, although they were higher during the initial stages of the Points-Based System (PBS). In 2014, about 85% of student extension applications (65,900) were accepted, while about 15% (11,100) were rejected. Note that these data refer to extensions granted to remain in the UK as students, and do not include students who extend their stay through transfer to work or family routes.
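The quoted acceptance rate follows directly from the two 2014 counts:

$$\frac{65{,}900}{65{,}900 + 11{,}100} \;=\; \frac{65{,}900}{77{,}000} \;\approx\; 0.86$$

which is consistent with the roughly 85/15 split between grants and refusals reported above.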

Figure 6

Students bring fewer dependents than other categories of migrants

The ratio of main applicants to dependents awarded Tier 4 Student visas is approximately 10:1, meaning that slightly more than one dependent visa is granted for every ten main applicant student visas. For Tier 2 labour migrants, the ratio has historically ranged between 10:6.5 and 10:8.
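Expressed as dependents per main applicant, these ratios imply approximately:

$$\text{Tier 4: } \tfrac{1}{10} = 0.1 \qquad \text{Tier 2: } \tfrac{6.5}{10}\text{ to }\tfrac{8}{10} = 0.65\text{ to }0.8$$

that is, roughly one dependent for every ten students, compared with between six and eight dependents for every ten Tier 2 labour migrants.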


Evidence gaps and limitations

The usual inconsistencies among different sources of data on UK migration flows apply to the estimation of student migration. The sources include Home Office administrative data, LTIM estimates based on the International Passenger Survey, and estimates of changes in the population of foreign students in the UK from the quarterly Labour Force Survey.

Each of these sources has important limitations as a measure of student migrants coming to the UK.

  • The IPS is a survey of a randomly selected sample of passengers entering and leaving the UK, rather than an actual count of migrants and other passengers. This has several implications. First, the IPS yields only estimates of migration, and these estimates come with a substantial margin of error as well as possible unknown biases. Further, the IPS detects students through a single question asking passengers to report their primary reason for migration; it might therefore undercount students if some give a different primary reason for coming to the UK, such as work or accompanying family. The IPS includes EU as well as non-EU nationalities. The ONS publishes estimates of EU and non-EU citizens separately, but these estimates are based only on IPS data and do not include the adjustments (e.g. for “switching” from short-term to longer-term stays in the UK) that the ONS makes to produce LTIM from IPS data. The estimates of student migration from IPS data are usually slightly lower than adjusted LTIM estimates.
  • Administrative data exclude EEA and Swiss citizens, and include individuals who are not long-term migrants by ONS definitions. Students staying between 6 and 12 months need student visas to enter but do not stay long enough to meet the ONS definition of a long-term migrant. Visa data also include some people who obtain visas but never come to the UK – a 2010 Home Office report on students found that, for a subset of educational institutions examined, 20% of those offered admission and granted a CAS (Confirmation of Acceptance for Studies) and/or visa did not enter the UK. The 20% figure is not reliable, as it was drawn from a “convenience sample” and may not be representative, but the conceptual point remains valid. Passenger entry data may double-count some arrivals, or overestimate student entries for some unknown reason. (Passenger entries of students consistently exceed visas, although the reverse should be true if some visa-grantees do not arrive and visas are required for entry.)
  • LFS data probably undercount students, especially those living in dormitories and other communal dwellings. Also, the LFS includes information on the number of international students in the UK at any one time, but only began in 2010 to ask respondents whether they came to the UK for the purpose of study, so it has limited value for examining trends over time.

Estimating the contribution of students to net migration, as opposed to simply total inward migration, adds further problems. The IPS currently provides the only estimates of outward migration. Until the most recent year’s data, however, the IPS did not assess the number of people who studied in the UK and then emigrated, but rather tracked the number leaving the UK to study elsewhere. Thus, the positive “net migration” of students based on past years’ IPS data shows only that more people come to study in the UK than leave the UK to study; it contained no information about the outward flows of people who originally came to the UK as students.

The 2010 Home Office report on student migration, and follow-up studies since then, included an analysis of international students’ compliance. However, this analysis is based on a sample of institutions chosen not at random, but for “convenience”. More specifically, the study examined universities on the UK government’s Highly Trusted Sponsors list, while choosing other educational institutions from a list of those that had been subject to investigations because of suspicions about their legitimacy. One would therefore expect this methodological choice to underestimate non-compliance rates for university students while inflating non-compliance rates for other institutions. Moreover, the data do not show actual overstayers but only the “potentially non-compliant”: those for whom there is no record of leaving the UK or of a valid extension of their stay. But since data on exits from the UK are among the weakest points in UK migration data, this study cannot be taken as an accurate measure of actual overstaying or “non-compliance”, either overall or for students at particular types of institutions. New data from the IPS, discussed above, may fill this gap in time.

Overstaying should not be confused with legitimate extensions to student visas, as sometimes occurs in media reports (as in the headline correction in McVeigh 2010). Legitimate extensions through work, family, or extended study are shown in administrative data and examined in the Migrant Journey research reports produced by the Home Office (Achato et al. 2010, Home Office 2013, Home Office 2015a).

Finally, there has been concern among politicians and the public about “bogus colleges” sponsoring entries but not actually offering courses of study (Anderson and Rogaly 2005), and about possible “non-compliance”: students overstaying their visas and staying in the UK without legal permission to remain. As is generally true for non-compliance with immigration law, it is difficult to capture these in official data because of the nature of non-compliance. The Home Office studied the issue of bogus colleges in 2007, prior to the PBS and the tightening of rules for institutions registering to teach foreign students. This study found that 25% of the 1,200 colleges inspected were not “genuine” (Home Affairs Committee 2009), but there are no comparable data on post-PBS accredited institutions. A subsequent Home Office study (Home Office 2010) may not provide valid estimates as it relied on samples of convenience rather than representative samples.


References

Press contact

If you would like to make a press enquiry, please contact:
Rob McNeil
+ 44 (0)1865 274568
+ 44 (0)7500 970081
robert.mcneil@compas.ox.ac.uk