TIMSS 2019 Design

Methodology
  • International large-scale sample survey of students’ mathematics and science achievement, and of their educational contexts – community, home, school, classroom, and student
  • Monitoring of trends since 1995 by reporting results from successive cycles on a common achievement scale
  • Predominantly quantitative, with qualitative information presented in descriptive country chapters in the TIMSS 2019 Encyclopedia
Method(s)

Overall data collection approach:

  • Proctored assessment of student achievement
  • Self-administered surveys for students, parents, teachers, and school principals
Target population
  • Fourth grade: All students enrolled in the grade that represents four years of schooling counting from the first year of ISCED Level 1, provided that the mean age at the time of testing is at least 9.5 years.
  • Eighth grade: All students enrolled in the grade that represents eight years of schooling counting from the first year of ISCED Level 1, provided that the mean age at the time of testing is at least 13.5 years.
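
A minimal Python sketch of the grade-selection rule above; the function name and example values are hypothetical, while the year counts and age thresholds come directly from the definitions:

```python
# Illustrative check of the TIMSS 2019 target-grade rule; the function name and
# inputs are hypothetical, the thresholds come from the population definitions.

def is_valid_target_grade(years_since_isced1_start: int,
                          mean_age_at_testing: float,
                          target: str = "fourth") -> bool:
    """True if a grade satisfies the TIMSS 2019 population definition.

    years_since_isced1_start: years of schooling counted from the first year
        of ISCED Level 1 (4 for fourth grade, 8 for eighth grade).
    mean_age_at_testing: national mean age of students at the time of testing.
    """
    required_years, min_mean_age = {"fourth": (4, 9.5), "eighth": (8, 13.5)}[target]
    return (years_since_isced1_start == required_years
            and mean_age_at_testing >= min_mean_age)

# Example: a grade four years after the start of ISCED Level 1, mean age 10.2.
assert is_valid_target_grade(4, 10.2, "fourth")
```
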
Sample design
Stratified two-stage cluster sample design

First stage: Sampling schools

  • Probability proportional to the size of the school (PPS)
  • Stratifying schools (optional):
    • According to important (demographic) variables (e.g., region of the country, school type or source of funding, level of urbanization)
    • Can take two forms: explicit or implicit stratification
  • Random-start fixed-interval systematic sampling (see the sketch after this list)
  • Schools sampled at the same time for field test and main data collection
  • Sampling of two replacement schools for each school sampled (main data collection only)
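
To make the first stage concrete, the following Python sketch applies random-start fixed-interval systematic sampling with probability proportional to size to a hypothetical school frame within a single stratum. It leaves out the operational refinements handled by the IEA sampling software (e.g., certainty schools, small-school rules, and the selection of replacement schools):

```python
import random

def pps_systematic_sample(schools, n_sample, seed=None):
    """Select schools with probability proportional to size (PPS) using
    random-start fixed-interval systematic sampling.

    schools: list of (school_id, measure_of_size) pairs, pre-sorted by the
        implicit stratification variables.
    n_sample: number of schools to select.
    """
    rng = random.Random(seed)
    total = sum(size for _, size in schools)
    interval = total / n_sample            # fixed sampling interval
    start = rng.uniform(0, interval)       # random start within the first interval
    hits = [start + k * interval for k in range(n_sample)]

    selected, cumulative, hit_idx = [], 0.0, 0
    for school_id, size in schools:
        cumulative += size
        while hit_idx < len(hits) and hits[hit_idx] < cumulative:
            selected.append(school_id)     # this school covers the selection point
            hit_idx += 1
    return selected

# Hypothetical frame: (school_id, number of students in the target grade)
frame = [("S01", 120), ("S02", 45), ("S03", 300), ("S04", 80),
         ("S05", 60), ("S06", 210), ("S07", 150), ("S08", 95)]
print(pps_systematic_sample(frame, n_sample=3, seed=1))
```

In this simplified version, a school whose size exceeds the sampling interval could be hit more than once; in practice such schools are typically taken into the sample with certainty before the PPS step.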

 

Second stage: Sampling classes within schools

  • One or more intact classes from the target grade of each sampled school, selected using systematic random sampling
  • Probability inversely proportional to school size (see the sketch after this list)
  • Only after sampled schools have agreed to participate in the study
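
The two stages are designed to combine into roughly equal overall selection probabilities for students: a large school is more likely to be sampled, but any one of its classes is less likely to be drawn. A simplified illustration with hypothetical numbers (the operational sampling weights additionally account for nonresponse and other adjustments):

```python
def overall_selection_probability(n_schools_sampled, school_size, total_size,
                                  n_classes_sampled, n_classes_in_school):
    """Combine both stages: PPS selection of the school times equal-probability
    selection of classes within it, which makes the class stage inversely
    proportional to school size."""
    p_school = n_schools_sampled * school_size / total_size
    p_class_given_school = n_classes_sampled / n_classes_in_school
    return p_school * p_class_given_school

# Two hypothetical schools in a frame of 100,000 target-grade students,
# with 150 schools sampled and one class drawn per sampled school.
p_small = overall_selection_probability(150, 50, 100_000, 1, 2)   # small school
p_large = overall_selection_probability(150, 200, 100_000, 1, 8)  # large school
print(p_small, p_large)  # both roughly 0.0375: approximately self-weighting
```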

 

For eTIMSS countries: Bridge sample

  • Countries participating in eTIMSS were required to also select an equivalent, smaller bridge sample by sampling one additional class from a subset of the sampled schools, by selecting a distinct sample of schools, or by a combination of both strategies.
Sample size

TIMSS

  • Per country and target grade
    • School sample: minimum 150 schools
    • Student sample: about 4,000 students
  • Across all countries and both grades, approximately 580,000 students (330,000 students in fourth grade and 250,000 in eighth grade)

 

eTIMSS

  • Countries transitioning to eTIMSS were required to assess an additional sample of at least 1,500 tested students for the bridge data collection.
Data collection techniques and instruments

Student assessments in mathematics and science

  • Formats
    • Written format (paperTIMSS) and digital format (eTIMSS)
    • Countries participating in eTIMSS could choose from the following delivery methods: USB delivery, tablet delivery, local server method
  • Types of items
    • paperTIMSS and eTIMSS
      • Selected responses (single and multiple selection of response options)
      • Constructed responses (writing or typing, drawing; dragging and dropping in eTIMSS; matching items in paperTIMSS)
    • eTIMSS (additionally): PSIs (Problem Solving and Inquiry tasks), i.e., interactive and situational simulation tasks
  • Items organized into item blocks and then into achievement booklets (for paperTIMSS) or item block combinations, i.e., digital “booklets” (for eTIMSS)
    • Booklets
      • A total of 14 test booklets per grade
      • Four blocks per booklet (2 mathematics, 2 science, in counterbalanced order); each item block appears in two booklets
    • Blocks
      • A total of 28 item blocks per grade (14 with mathematics items, 14 with science items)
      • Items per block
        • Grade 4: 10–14 items in each block
        • Grade 8: 12–18 items in each block 
      • eTIMSS contained 4 extra blocks to accommodate PSI tasks
    • Items per assessment in Grade 4
      • paperTIMSS and eTIMSS: 175 items in each of mathematics and science
      • Less difficult mathematics: 179 mathematics items (including less difficult and regular items)
      • PSI: 39 mathematics items (from three PSIs) and 19 science items (from two PSIs)
      • Bridge booklets (trend items only): 92 mathematics items and 111 science items
    • Items per assessment in Grade 8
      • paperTIMSS and eTIMSS: 211 items in mathematics and 220 in science
      • PSI: 25 mathematics items (from three PSIs) and 29 science items (from two PSIs)
      • Bridge booklets (trend items only): 117 mathematics and 122 science items
    • As far as possible, the distribution of items across content and cognitive domains within each block matches the distribution across the item pool overall.
    • Matrix sampling of items (rotated test booklet design; see the sketch below)
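
A small Python sketch of one rotated booklet design with the properties listed above: 14 booklets, 28 blocks, four blocks per booklet (two mathematics and two science, counterbalanced), and each block appearing in exactly two booklets. The concrete block-to-booklet assignment generated here is only illustrative; the official assignment is specified in the TIMSS 2019 assessment design documentation:

```python
from collections import Counter

def build_booklets(n=14):
    """Rotated design: booklet b carries blocks b and b+1 of each subject, so
    neighbouring booklets overlap and every block appears in two booklets."""
    math = [f"M{i:02d}" for i in range(1, n + 1)]
    sci = [f"S{i:02d}" for i in range(1, n + 1)]
    booklets = []
    for b in range(n):
        m_pair = [math[b], math[(b + 1) % n]]
        s_pair = [sci[b], sci[(b + 1) % n]]
        # Counterbalance subject order: alternate which subject comes first.
        blocks = m_pair + s_pair if b % 2 == 0 else s_pair + m_pair
        booklets.append((f"Booklet {b + 1:02d}", blocks))
    return booklets

booklets = build_booklets()
for name, blocks in booklets[:3]:
    print(name, blocks)   # e.g. Booklet 01 ['M01', 'M02', 'S01', 'S02']

# Every one of the 28 blocks appears in exactly two booklets.
counts = Counter(block for _, blocks in booklets for block in blocks)
assert len(counts) == 28 and set(counts.values()) == {2}
```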

 

  • Linking mechanisms
    • Between cycles
      • New items in 2019: 12 blocks (6 mathematics and 6 science)
      • The other 16 blocks (8 mathematics and 8 science) contain (unreleased) trend items; for each subject, all 8 trend blocks were administered in 2015 and 3 of them also in 2011.
    • Between regular and less difficult mathematics: In the less difficult version, one-third of the items were from the regular assessment and two-thirds were less difficult items.
    • Between paperTIMSS and eTIMSS: Bridge (in order to control for mode effects):
      • eTIMSS countries administered the complete computer-based eTIMSS 2019 assessment as well as a smaller, paper-based version of the trend items.
      • That is, eTIMSS countries re-administered their eight blocks of trend items from 2015 in paperTIMSS format.
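
As a rough, purely illustrative view of what the bridge data make possible (the operational mode-effect analysis relies on IRT scaling of the combined data, not on raw percent-correct comparisons), one could contrast performance on the common trend items between the paper-based bridge sample and the digital eTIMSS sample; all numbers below are made up:

```python
# Hypothetical proportion-correct values for three trend items administered both
# in the paper bridge sample and in eTIMSS. Positive differences would suggest an
# item was easier on screen, negative differences easier on paper.
paper_bridge = {"item_01": 0.62, "item_02": 0.48, "item_03": 0.71}
etimss = {"item_01": 0.58, "item_02": 0.47, "item_03": 0.66}

for item in paper_bridge:
    diff = etimss[item] - paper_bridge[item]
    print(f"{item}: eTIMSS minus paper = {diff:+.2f}")
```
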
Techniques
  • Achievement test
  • Questionnaire
Languages
  • Assessment instruments administered in 50 languages
  • Instruments administered in two or more languages in 31 countries and four benchmarking entities 
  • The most common languages:
    • English (24 countries)
    • Arabic (10 countries)
Translation procedures
  • The international version of all assessment instruments is developed in English by the TIMSS & PIRLS International Study Center.
  • Instruments are then translated by participating countries into their languages of instruction.
  • Translations are subsequently verified by independent linguistic and assessment experts in order to ensure equivalence with the international version.
  • Following translation verification, all draft instruments undergo layout verification by the TIMSS & PIRLS International Study Center to ensure they are internationally comparable and of high quality.
  • For the eTIMSS achievement materials, the procedures for translation and verification took place in the eTIMSS Online Translation system, part of IEA Hamburg’s eAssessment system.
Quality control of operations

Measures during data collection

  • National Research Coordinator (NRC) in each participating country responsible for data collection
  • Standardized survey operations procedures: step-by-step documentation of all operational activities provided with manuals, including eTIMSS versions of manuals
  • Full-scale field test of all instruments and operational procedures, including eTIMSS instruments and procedures (in each participating country and benchmarking entity)
  • Provision of software tools for supporting activities (e.g., sampling and tracking classes and students, administering school, teacher and home questionnaires online, documenting scoring reliability, creating and checking data files)
  • Training of NRCs and their staff, school coordinators, test administrators, etc.
  • School visits conducted by international quality control monitors (IQCMs) during test administration (15 schools per grade and country, 5 additional schools in benchmarking entities and 3 additional schools for paper bridge booklet administration in eTIMSS countries)
  • National quality control program
  • Survey activities questionnaire (SAQ) to be completed by NRCs

 

Measures during data processing and cleaning

  • Testing of all data cleaning programs with simulated data sets
  • Data and documents receipt database maintained by IEA Hamburg
  • Standardized cleaning process and documentation of any necessary data recoding
  • National adaptation database
  • Repetition of data cleaning and comparison of new data sets with preceding versions
  • Identification of irregularities in data patterns and correction of data errors