ICILS 2018 Design

Methodology
  • International large-scale sample survey of students, teachers, and school principals
  • Student achievement test allowing monitoring of trends across successive cycles
  • Predominantly quantitative, with qualitative information presented in descriptive country chapters
Method(s)
ICILS 2018
  • Overall approach to data collection
    • Proctored assessment of student achievement
    • Self-administered surveys for students, teachers, and school principals
  • Specification
    • Cross-sectional
    • Some trend reporting possible with previous cycles
ICILS Teacher Panel 2020
  • Overall approach to data collection: Self-administered surveys
  • Specification: Longitudinal
Target population
ICILS 2018
  • Students
    • The student target population consisted of all students enrolled in the grade that represented eight years of schooling, counting from the first year of ISCED Level 1, provided that the mean age at the time of testing was at least 13.5 years.
    • Students older than 17 years were not included in the target population.
    • In countries where the average age of students in Grade 8 was less than 13.5 years, Grade 9 was defined as the target grade.
  • Teachers: All teachers teaching regular school subjects to students of the target grade (regardless of the subject or the number of hours taught) during the ICILS testing period who had been employed at the school since the beginning of the school year.
  • Schools
    • The population for the ICILS school survey comprised schools at which target grade students were enrolled.
    • Principals of sampled schools were asked to complete the school questionnaire.

 

ICILS Teacher Panel 2020
  • Students: The panel study did not survey students.
  • Teachers: All teachers eligible for ICILS 2018 except those who left the profession.
  • Schools: All schools eligible for ICILS 2018 that were still operating in 2020.
  • The panel study also administered separate questionnaires to principals and designated ICT coordinators in each school.
Sample design
ICILS 2018: Stratified two-stage cluster sampling design, optimized for the student population

Schools selected at the first stage for the student population were also considered sampled for the teacher and school populations.

 

First stage: sampling schools

  • Selection probability proportional to the size of the school (PPS)
  • Optional: stratification of schools according to (demographic) variables of interest (e.g., school type or source of funding, level of urbanization, region of the country), either explicit or implicit
  • Random-start fixed-interval systematic sampling
  • Schools sampled at the same time for field trial and main data collection
  • For each sampled school, two replacement schools were assigned where possible
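The first-stage procedure above (PPS selection with a random start and fixed interval) can be sketched as follows. This is an illustrative sketch only: the function name and the `(school_id, size)` input format are assumptions, and it omits refinements a production framework would add, such as treating very large schools (size exceeding the sampling interval) as certainty selections.

```python
import random

def pps_systematic_sample(schools, n_sample, seed=None):
    """Select n_sample schools with probability proportional to size (PPS)
    using random-start fixed-interval systematic sampling.

    `schools` is a list of (school_id, measure_of_size) pairs, ideally
    pre-sorted by the implicit stratification variables so that the
    systematic selection spreads across strata.
    """
    rng = random.Random(seed)
    total = sum(size for _, size in schools)
    interval = total / n_sample          # fixed sampling interval
    start = rng.uniform(0, interval)     # random start within the first interval
    targets = [start + i * interval for i in range(n_sample)]

    selected, cumulative, t = [], 0.0, 0
    for school_id, size in schools:
        cumulative += size
        # a school is selected once for every target point its size span covers
        while t < n_sample and targets[t] < cumulative:
            selected.append(school_id)
            t += 1
    return selected
```

Sorting the frame by stratification variables before sampling is what makes the stratification "implicit": the fixed interval then allocates selections roughly proportionally across strata.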

 

Second stage: sampling students

Within schools agreeing to participate:

  • Systematic random sampling was used to select 20 students from the sampled school in the target grade.
  • In schools with fewer than 20 students in the target grade, all students were invited to participate.
  • Each student had an equal selection probability within the school.
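The within-school step can be sketched in the same way. The helper below is a hypothetical illustration of equal-probability systematic sampling from an ordered student list, not IEA's WinW3S implementation; it also simplifies the all-students rule to "at or below the sample size" (the operational ICILS threshold was slightly higher).

```python
import random

def sample_students(students, n=20, seed=None):
    """Equal-probability systematic random sample of n students within a school.

    Schools with n or fewer eligible students contribute all of them
    (simplified here; ICILS used a slightly higher all-students threshold).
    """
    rng = random.Random(seed)
    if len(students) <= n:
        return list(students)
    interval = len(students) / n         # sampling interval > 1
    start = rng.uniform(0, interval)     # random start
    # every student has the same selection probability n / len(students)
    return [students[int(start + i * interval)] for i in range(n)]
```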

 

Second stage: sampling teachers

Within schools agreeing to participate, 15–20 teachers were selected from all eligible teachers using systematic random sampling.

 

ICILS Teacher Panel 2020

The Teacher Panel was based on the existing random samples of schools and teachers from the 2018 ICILS survey in the three participating countries:

  • For teachers, the goal was to identify the exact same individuals as in 2018.
  • In contrast, for principals and ICT coordinators, the questionnaires focused on school‐level details, so that finding the same individuals from the 2018 study was not necessary.

However, the data are subject to a certain degree of uncertainty because

  • Not all schools and teachers responding in 2020 participated in the 2018 survey.
  • Not all schools and teachers responding in 2018 participated in the follow-up survey in 2020.

 

General notes

  • Sampling of schools was conducted by the sampling team at IEA Hamburg.
  • Sampling procedures within schools were carried out by the national study centers, using the Within-school Sampling Software for Windows (WinW3S) provided by IEA.
Sample size
Intended per country
  • Approx. 3,000 assessed students
  • Approx. 2,600 participating teachers
  • Minimum of 150 schools (in countries with fewer than 150 schools, all available schools were included)
  • In each sampled school, selection of 20 students (or all if the number of target grade students was less than or equal to 25) and 15 target grade teachers (or all if the number of target grade teachers was less than or equal to 20)
  • Required effective sample size: minimum of 400 students
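The effective sample size requirement means the clustered sample must match the precision of a simple random sample of 400 students. Because students within a school are more alike than students across schools, more than 400 actual students are needed. Under the standard cluster design effect formula deff = 1 + (b − 1)·ρ, the required number of assessed students can be sketched as below; the intraclass correlation value is a hypothetical illustration, not an ICILS figure.

```python
def required_sample_size(n_eff, cluster_size, rho):
    """Actual sample size needed to reach a given effective sample size,
    using the standard cluster design effect deff = 1 + (b - 1) * rho."""
    deff = 1 + (cluster_size - 1) * rho
    return n_eff * deff

# hypothetical intraclass correlation of 0.3, for illustration only
n = required_sample_size(400, 20, 0.3)   # about 2,680 students
```

With 20 students per school and an assumed ρ of 0.3, this yields roughly 2,700 students, consistent with the intended figure of approximately 3,000 assessed students per country across 150 schools.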

 

Total achieved

ICILS 2018

Approximately 46,000 students, 26,000 teachers, and 2,200 schools (i.e., principals and ICT coordinators) from 14 education systems (12 countries and two benchmarking entities) participated.

ICILS Teacher Panel 2020

Approximately 2,150 teachers and data from 260 schools (i.e., principals and ICT coordinators) in three countries participating in both study years (2018 and 2020)

Data collection techniques and instruments
ICILS 2018

Student CIL and CT assessments  

  • The tests were embedded within modules: in total, there were five 30-minute CIL modules and two 25-minute CT modules.
  • Each student completed two of the five CIL modules; both the selection and the order of the modules were assigned randomly.
  • In countries participating in the CT option, each student completed the two 25-minute modules following completion of the CIL test and student questionnaire. The CT module order was balanced across students.
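The rotation described above can be sketched per student as follows. The module labels and function are hypothetical placeholders; in practice, large-scale assessments like ICILS balance module order through a fixed booklet design rather than independent per-student draws, so this is a simplified illustration of the assignment logic only.

```python
import random
from itertools import permutations

CIL_MODULES = ["CIL-1", "CIL-2", "CIL-3", "CIL-4", "CIL-5"]  # placeholder labels
CT_MODULES = ["CT-1", "CT-2"]                                # placeholder labels

def assign_modules(rng):
    """Pick an ordered pair of distinct CIL modules (two of five) and one
    of the two possible CT module orders for a single student."""
    cil_pair = rng.sample(CIL_MODULES, 2)  # selection and order both random
    ct_order = list(rng.choice(list(permutations(CT_MODULES))))
    return cil_pair, ct_order
```

There are 5 × 4 = 20 ordered CIL pairs and 2 CT orders, so the design spreads item exposure and position effects across students.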

 

Student questionnaire

  • Following the CIL assessment, each student completed a 30-minute student questionnaire.
  • The student questionnaire investigated student engagement with ICT and included questions on students' experience and use of ICT, their attitudes towards computers and ICT, and their background characteristics.

 

Questionnaires for teachers, ICT coordinators, and principals

  • ICILS 2018 also included a teacher questionnaire, which was completed by 15 randomly selected Grade 8 teachers in each sampled school.
  • The questions concerned teachers' familiarity with ICT, their use of ICT in educational activities, their perceptions of ICT in schools, learning to use ICT in teaching, and their background characteristics.
  • The ICT coordinators completed a 15-minute questionnaire, which included questions on ICT resources in the school, ICT use in school, ICT technical support, and provisions for professional development in ICT.
  • The principal of each sampled school completed a 15-minute questionnaire answering questions related to school characteristics, policies, procedures, and priorities for ICT.

 

National coordinator questionnaires

  • National research coordinators (NRCs) collected data from experts in a national context survey (NCS).
  • The survey was used for gathering information about the structure of the education system and systematic descriptions of policy and practice in the use of ICT in school education.

 

ICILS Teacher Panel 2020

Teacher questionnaire

  • Expected to take 40 minutes to complete, compared to 30 minutes in the ICILS 2018 study.

 

School principal questionnaire

  • Expected to take 20 minutes, compared to 15 minutes in the ICILS 2018 study.

 

ICT coordinator questionnaire

  • Expected to take 18 minutes, compared to 15 minutes in the ICILS 2018 study.

 

All questionnaires were administered online, using the same version of the IEA Online Survey System (OSS) as in ICILS 2018.

Languages
ICILS 2018
  • Administration of assessment instruments and questionnaires in 17 different languages
  • The most common languages
    • English (2 countries)
    • Spanish (2 countries)
    • German (2 countries)
    • Russian (2 countries)

 

ICILS Teacher Panel 2020

Administration of questionnaires in 4 different languages:

  • Danish (in Denmark)
  • Finnish and Swedish (in Finland)
  • Spanish (in Uruguay)
Translation procedures
  • Development of an international version of all assessment instruments in English by the ICILS International Study Center
  • Translation into applicable languages of instruction by national research coordinators (NRCs)
  • Translation verification by linguistic and assessment experts in order to ensure equivalence with the international version
Quality control of operations

Measures during data collection

  • Participants were responsible for data collection within their own respective territories.
  • Standardized survey operation procedures: step-by-step documentation of all operational activities provided with the operation manuals
  • Full-scale field test of all instruments and operational procedures (in each participating country and entity)
  • Provision of software tools for supporting activities (e.g., sampling and tracking classes and students; administering school and teacher questionnaires; documenting scoring reliability; creating and checking data files)
  • Training, e.g., for national research coordinators (NRCs) and their staff, for school coordinators, and test administrators
  • School visits conducted by international quality observers (IQOs) during test administration (at 15 schools per grade and country)
  • Training, e.g., for IQOs on procedures for arranging school visits, interviewing the school coordinator and test administrator, and reporting on these activities
  • Documentation of data collection activities by IQOs to determine whether the ICILS assessment was administered in compliance with the standardized procedures
  • National quality control program by national study centers that involved national quality officers (NQOs) to visit 10 percent of the sampled schools (minimum 15 schools) during testing
  • Survey activities questionnaire (SAQ) completed by NRCs

 

Measures during data processing and cleaning

  • Testing of all data cleaning programs with simulated data sets
  • Registering all incoming data and documents in a specific database recording the date of arrival
  • All systematic data recodings were documented.
  • Iterative data cleaning process on each national dataset until all data were consistent and comparable
  • Standardized cleaning process
  • Repetition of data cleaning and comparison of the new data sets with the preceding version
  • Finally, identification and correction of irregularities in data patterns.