Documentation of Eurostudent VI
Eurostudent VI – a large online survey
Statistics Norway conducted Eurostudent VI in 2016. The online survey collected data on living conditions from a sample of 24 000 students in Norway, and yielded 8 200 approved responses.
Eurostudent surveys the social dimension of European higher education; a total of 28 countries participated in 2016. This was the third time Statistics Norway collected data for the survey in Norway. To meet the need for more data, the sample was increased to four times the size of the sample used three years earlier. The sample was randomly drawn.
Change in target group in Eurostudent
In 2016, the target group was changed. The goal of Eurostudent is to obtain statistics on higher education students in Norway enrolled in programmes below PhD level and above a two-year vocational programme. New in 2016 was the requirement that respondents meet their lecturers face-to-face: unlike in earlier rounds, distance learning students were considered outside the target group.
More than 8 200 students in the target group participated, and the response rate was 37 per cent. Nearly 1 500 persons were found to be outside the target group, about half of them distance learning students. The data collection results are presented in Table 1.
|Table 1. Data collection result for Eurostudent VI
|                         |
| Gross sample            | 24 000
| Approved responses      | 8 200
| Outside target group    | ca. 1 500
| Method of collection    | Web
| Field period            | 18 April – 9 May 2016
Comparison with Eurostudent IV and V
Evaluating Eurostudent VI requires comparison with previous rounds. In 2010, 37 per cent of the sample of 6 500 participated in Eurostudent IV, and in 2013, 43 per cent of the sample of 8 000 participated in Eurostudent V. In 2013, invitation letters were sent out and some data was collected via printed questionnaires. In 2016, the sample was increased to 24 000 and the entire survey was conducted digitally. The response rate in 2016 was lower than in 2013, but similar to that in 2010. The wish for a larger number of participants in Eurostudent VI was fulfilled: the number of responses in 2016 exceeded the entire sample of three years earlier.
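The reported response rate can be reproduced from the figures in this documentation. A minimal sketch, assuming the rate is computed on the net sample (gross sample minus out-of-scope cases; the 1 500 out-of-scope figure is approximate, so the computed rate lands close to, not exactly on, the reported 37 per cent):

```python
def response_rate(gross_sample: int, outside_target: int, responses: int) -> float:
    """Response rate on the net sample (gross sample minus out-of-scope cases)."""
    net_sample = gross_sample - outside_target
    return responses / net_sample

# Figures for Eurostudent VI as reported in this documentation.
rate_2016 = response_rate(24_000, 1_500, 8_200)
print(f"Eurostudent VI response rate: {rate_2016:.1%}")  # about 36.4 %
```

Computing the rate on the gross sample of 24 000 instead would give roughly 34 per cent, which is why the net-sample definition is the plausible one here.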
Figure 1. Number of respondents and sample size for Eurostudent, by year
Responses to the first question in the survey can be used to indicate contact with the sample. More than 13 000 people answered the first question, meaning that more than half of the sample began to fill out the survey. Of everyone we reached online, more than 1 300 chose to answer the questions in English.
Before conducting the survey, we contacted the educational institutions and asked them to encourage their students to participate. A total of five text messages and five emails were sent during the first five days, one message of each type per day. Two further emails were sent in the third week. Reminders went to those who had not responded at all and to those who had not completed the survey.
Figure 2. Web contact and completed forms in target group, by survey field days¹
¹ Number of web contacts with everyone in the sample compared to the number of completed forms from students in the target group, by field day (Monday 18 April is day number one).
The data from the students in the survey are used to form a picture of the entire student population. For the sample to be representative, response rates need to be similar across groups. For example, the shares of men and women among respondents should be similar to their shares in the sample.
The sample was made up of 40 per cent men and 60 per cent women. Of all completed forms, 38 per cent were from men and 62 per cent were from women. This is very similar to the distribution in the sample and the result is considered acceptable.
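The comparison above can be written out explicitly; a small sketch using the percentages reported in this documentation:

```python
# Gender distribution in the sample vs. among completed forms,
# using the shares reported in this documentation.
sample_share = {"men": 0.40, "women": 0.60}
respondent_share = {"men": 0.38, "women": 0.62}

for group in sample_share:
    diff = respondent_share[group] - sample_share[group]
    print(f"{group}: sample {sample_share[group]:.0%}, "
          f"respondents {respondent_share[group]:.0%}, "
          f"difference {diff:+.0%}")
```

The differences are two percentage points in each direction, which is why the distribution is considered acceptable.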
Different types of errors in the survey
Technical obstacles in the survey
An online form relies on technology that not everyone can use. It might not work on a person's mobile phone or computer, and this can bias who responds to the survey.
Figure 2 shows that many of those who started filling out the form did not finish, so the degree of online contact does not reflect the actual number of respondents. This pattern holds throughout the collection period. It is reasonable to assume that it is partly due to technical obstacles.
A total of 3 000 fewer people responded to question five than to question four, even though the preceding questions are all easy to answer. The high number of respondents who stopped near the beginning of the survey suggests that technical problems prevented many from continuing; indeed, several participants contacted us about such problems.
Some questions are difficult to answer, particularly those asking for facts that are hard to recall or be certain about, such as financial information. Not everyone remembers how much they have spent, and some contacted us to explain that it was difficult to put numbers on the costs associated with studying. Questions about parents' education were considered difficult to answer because the education in question often went by a different name than the categories available.
Non-response and incomplete questionnaires can make the survey less representative. To compensate for this, the survey is weighted, and the weights can be used together with the data to correct for non-response in certain groups.
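The documentation does not specify the weighting procedure. A minimal sketch of one common approach, post-stratification, is shown below; the method is an assumption, and the gender shares from the survey are used only for illustration:

```python
# Post-stratification sketch (an assumed method; this documentation does
# not specify the actual weighting procedure). Each respondent in a group
# gets weight = (group's share of the sample) / (group's share of responses),
# so under-represented groups count for slightly more in weighted results.
sample_share = {"men": 0.40, "women": 0.60}      # shares drawn into the sample
respondent_share = {"men": 0.38, "women": 0.62}  # shares among completed forms

weights = {g: sample_share[g] / respondent_share[g] for g in sample_share}
print(weights)  # men get a weight just above 1, women just below 1
```

In practice, Statistics Norway may weight on more variables than gender (such as age, institution, or field of study); this sketch only illustrates the principle.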
The wish to have a larger dataset on students was fulfilled in Eurostudent VI. Conducting the survey digitally reduces the data collection time and enables large numbers to participate simultaneously.