ICILS 2018 Design

Quantitative Study
  • Overall approach to data collection
    • Proctored assessment and self-administered surveys
  • Specification
    • Cross-sectional
    • Some trend reporting possible with previous cycles
Target population

Students

  • Students in Grade 8
  • Typically around 14 years of age in most countries (in education systems where the average age of Grade 8 students is below 13.5 years, Grade 9 is defined as the ICILS target population)

Teachers

  • All teachers teaching regular school subjects to students in the target grade at each sampled school
Sample design
Multistage (two-stage) cluster sampling

First Stage (Schools)

  • Random selection of schools with a probability proportional to size (PPS), as measured by the number of students enrolled in the school.
  • Each country was advised to sample a minimum of 150 schools; in some small education systems, all schools were included in the survey.


Second Stage (Students)

  • Twenty students were randomly sampled from all enrolled students in the target grade.
  • In schools with fewer than 20 students in the target grade, all students were invited to participate.


Second Stage (Teachers)

  • Fifteen teachers were randomly selected from all teachers teaching in the target grade at each sampled school.
  • In schools with fewer than 20 teachers, all teachers were invited to participate in the survey.
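The two-stage design above can be sketched in a few lines of Python. This is an illustrative simplification, not the operational IEA sampling software: the systematic-PPS routine, the function names, and the record fields are assumptions, and real implementations additionally handle certainty schools, stratification, and replacement schools.

```python
import random

def pps_sample_schools(schools, n=150, seed=1):
    """First stage: select n schools with probability proportional to
    size (PPS), here via systematic sampling on cumulative enrolment."""
    rng = random.Random(seed)
    total = sum(s["enrolment"] for s in schools)
    interval = total / n
    start = rng.uniform(0, interval)
    points = [start + i * interval for i in range(n)]
    chosen, cum, idx = [], 0, 0
    for s in schools:
        cum += s["enrolment"]
        # every sampling point that falls within this school's
        # cumulative-size band selects that school
        while idx < len(points) and points[idx] <= cum:
            chosen.append(s)
            idx += 1
    return chosen

def sample_within_school(people, target, rng):
    """Second stage: simple random sample of a fixed number of students
    (20) or teachers (15); take everyone if the school is small."""
    if len(people) <= target:
        return list(people)
    return rng.sample(people, target)
```

Larger schools have more students, so PPS gives every student a roughly equal overall chance of selection once the fixed within-school sample is drawn.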
Sample size

Per Country

  • School Sample: 150-261 schools (minimum sample = 150, except for two smaller systems, for which the achieved samples were 35 and 110 schools, respectively)
  • Student Sample: 20 per school (target; all students in smaller schools)
  • Teacher Sample: 15 per school (target; all teachers in smaller schools)



Total (All Countries)

  • Schools: around 2,200
  • Students: around 46,000
  • Teachers: around 26,000
Data collection techniques and instruments

Student CIL and CT assessments

  • The CIL (computer and information literacy) and CT (computational thinking) tests were embedded within modules: five 30-minute CIL modules and two 25-minute CT modules in total.
  • Each student completed two of the five CIL modules; the selection and order of the CIL modules were assigned randomly.
  • In countries participating in the CT option, each student completed the two 25-minute CT modules after completing the CIL test and the student questionnaire. The CT module order was balanced across students.
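The rotation described above can be sketched as a per-student assignment routine. This is a hypothetical illustration: the module names and the `assign_booklet` helper are invented here, and the operational ICILS rotation used a predefined balanced design rather than fully independent per-student draws.

```python
import random

CIL_MODULES = ["CIL1", "CIL2", "CIL3", "CIL4", "CIL5"]  # placeholder names
CT_MODULES = ["CT1", "CT2"]

def assign_booklet(rng, ct_option=False):
    """Pick two of the five CIL modules with both selection and order
    randomized; in CT-option countries, append the two CT modules in a
    randomized (balanced across students) order."""
    booklet = rng.sample(CIL_MODULES, 2)  # selection and order both random
    if ct_option:
        ct = list(CT_MODULES)
        rng.shuffle(ct)                   # balance CT module order
        booklet += ct
    return booklet
```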


Student questionnaire

  • Following the CIL assessment, each student completed a 30-minute student questionnaire.
  • The student questionnaire investigated students' engagement with ICT and included questions about their experience with and use of ICT, their attitudes toward the use of computers and ICT, and their background characteristics.


Questionnaires for teachers, ICT coordinators, and principals

  • ICILS 2018 also included a teacher questionnaire, which was completed by 15 randomly selected Grade 8 teachers in each sampled school.
  • The questions related to teachers' familiarity with ICT, their use of ICT in educational activities, their perceptions of ICT in schools, learning to use ICT in teaching, and their background characteristics.
  • ICT coordinators also completed a 15-minute questionnaire, with questions about ICT resources in the school, ICT use in the school, ICT technical support, and provisions for professional development in ICT.
  • The principal of each sampled school completed a 15-minute questionnaire covering school characteristics, policies, procedures, and priorities for ICT.


National coordinator questionnaires

  • National research coordinators (NRCs) collected data from experts in a national context survey (NCS).
  • The survey was used for gathering information about the structure of the education system and systematic descriptions of policy and practice in the use of ICT in school education.

Languages used

  • Administration of assessment instruments and questionnaires in 17 different languages
  • The most common languages
    • English (2 countries)
    • Spanish (2 countries)
    • German (2 countries)
    • Russian (2 countries)
Translation procedures
  • Development of an international version of all assessment instruments in English by the ICILS International Study Center
  • Translation into applicable languages of instruction by national research coordinators (NRCs)
  • Translation verification by linguistic and assessment experts in order to ensure equivalence with the international version
Quality control of operations

Measures during data collection

  • Participating countries were responsible for data collection within their own territories.
  • Standardized survey operation procedures: step-by-step documentation of all operational activities provided with the operation manuals
  • Full-scale field test of all instruments and operational procedures (in each participating country and entity)
  • Provision of software tools for supporting activities (e.g., sampling and tracking classes and students; administering school and teacher questionnaires; documenting scoring reliability; creating and checking data files)
  • Training, e.g., for national research coordinators (NRCs) and their staff, for school coordinators, and test administrators
  • School visits conducted by international quality observers (IQOs) during test administration (at 15 schools per grade and country)
  • Training, e.g., for IQOs on procedures for arranging school visits, interviewing the school coordinator and test administrator, and reporting on these activities
  • Documentation of data collection activities by IQOs to determine whether the ICILS assessment was administered in compliance with the standardized procedures
  • National quality control program by national study centers that involved national quality officers (NQOs) to visit 10 percent of the sampled schools (minimum 15 schools) during testing
  • Survey activities questionnaire (SAQ) completed by NRCs


Measures during data processing and cleaning

  • Testing of all data cleaning programs with simulated data sets
  • Registering all incoming data and documents in a specific database recording the date of arrival
  • All systematic data recodings were documented.
  • Iterative data cleaning process on each national dataset until all data were consistent and comparable
  • Standardized cleaning process
  • Repetition of data cleaning and comparison of the new data sets with the preceding version
  • Finally, identification and correction of irregularities in data patterns.
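The iterative process above, where cleaning passes repeat on each national dataset until the data are stable and every systematic recoding is documented, can be sketched as a fixed-point loop. The rule and field names here are hypothetical, not the actual IEA cleaning programs.

```python
def clean_until_stable(records, rules, max_rounds=10):
    """Apply cleaning rules repeatedly until a full pass changes
    nothing; return the cleaned records plus a log of all recodings."""
    log = []
    for round_no in range(max_rounds):
        previous = [dict(r) for r in records]   # snapshot for comparison
        for rule in rules:
            records, changes = rule(records)
            log.extend((round_no, c) for c in changes)
        if records == previous:                 # dataset is stable: stop
            break
    return records, log

def recode_sex(records):
    """Example rule: recode letter codes to the numeric standard,
    reporting each change so it can be documented."""
    changes = []
    for r in records:
        if r.get("sex") in ("m", "M"):
            r["sex"] = 1
            changes.append(("sex", r["id"]))
        elif r.get("sex") in ("f", "F"):
            r["sex"] = 2
            changes.append(("sex", r["id"]))
    return records, changes
```

Comparing each pass against the preceding version of the dataset mirrors the "repetition of data cleaning and comparison with the preceding version" step described above.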