PISA-D Design

Methodology
Quantitative Study
Method(s)
  • Overall approach to data collection
    • Proctored assessment and self-administered questionnaire (in-school assessment)
    • Interviewer-administered questionnaire and proctored assessment in the household (out-of-school assessment)
  • Specification
    • One-off assessment with results linked to PISA 2015 (main PISA is a cross-sectional survey administered every three years)
Target population

Core study (in-school)

15-year-old students enrolled in an educational institution at Grade 7 or higher in their respective countries and economies, including those:

  • Enrolled full-time in an educational institution
  • Enrolled in an educational institution, but attending only on a part-time basis
  • Enrolled in a vocational training program, or any other related type of educational program
  • Attending a foreign school within the country (as well as students from other countries attending any of the programs in the first three categories)


Core study (out-of-school)

14-to-16-year-olds who are out of school or enrolled in an educational institution at Grade 6 or below in their respective countries and economies, including those:

  • Never enrolled full-time in an educational institution
  • Previously enrolled in an educational institution, but who dropped out before completing primary education
  • Previously enrolled in an educational institution, but who dropped out after completing primary education
  • Previously enrolled in an educational institution, but who dropped out during lower secondary education
  • Enrolled in Grade 7 or above, but not attending regularly
Sample design

In-school assessment

  • The international PISA-D target population in each participating country and economy consisted of 15-year-old students attending an educational institution in Grade 7 or higher.
  • In all countries, the sampling design used for the PISA-D assessment was a two-stage stratified sample design.


Out-of-school assessment

  • The target population for the PISA-D out-of-school assessment consisted of 14-to-16-year-olds who were out of school, or in school but in Grade 6 or below.
  • The core sample design consisted of two major strata, defined by high and low concentrations of the target population, with two components (representative and limited-representative) within each stratum.


IN-SCHOOL PART

The first-stage sampling units

  • Consisted of individual schools that had 15-year-old students, or that could have such students, at the time of assessment.
  • Schools were sampled systematically from a comprehensive national list of all PISA-D-eligible schools – the school sampling frame – with probabilities proportional to a measure of size; this is referred to as systematic probability-proportional-to-size (PPS) sampling (see the sketch after this list).
  • The measure of size was a function of the estimated number of PISA-D-eligible 15-year-old students enrolled in the school.
  • Prior to sampling, schools in the sampling frame were assigned to mutually exclusive groups based on school characteristics (explicit strata) specifically chosen to improve the precision of sample-based estimates.
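To make the selection mechanism concrete, the following is a minimal sketch of systematic PPS sampling in Python. The frame, school identifiers, and enrolment figures are illustrative assumptions, not PISA-D data:

```python
import random

def systematic_pps_sample(frame, n_schools):
    """Select n_schools from a frame by systematic PPS sampling.

    frame: list of (school_id, measure_of_size) pairs, assumed to be
           already sorted by the explicit and implicit strata.
    """
    total_mos = sum(mos for _, mos in frame)
    interval = total_mos / n_schools      # sampling interval
    start = random.uniform(0, interval)   # random starting point
    points = [start + i * interval for i in range(n_schools)]

    sample, cumulative, idx = [], 0.0, 0
    for school_id, mos in frame:
        cumulative += mos
        # a school is hit once for every selection point it spans;
        # very large schools can therefore be selected with certainty
        while idx < n_schools and points[idx] <= cumulative:
            sample.append(school_id)
            idx += 1
    return sample

# hypothetical frame: (school_id, estimated number of eligible 15-year-olds)
frame = [(f"sch_{i:03d}", random.randint(10, 400)) for i in range(600)]
print(systematic_pps_sample(frame, n_schools=150)[:5])
```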


The second-stage sampling units in countries using the two-stage design were students within sampled schools.

  • Once schools were selected into the sample, a complete list of each sampled school’s 15-year-old students was prepared.
  • For each country a target cluster size (TCS) was set, typically 42 students for computer-based countries and 35 for paper-based countries, although by agreement countries could use alternative values. The sample size within schools is prescribed, within limits, in the PISA-D Technical Standards.
  • From each list containing more students than the TCS, a sample of TCS students was selected with equal probability.
  • For lists of fewer students than the TCS, all students on the list were selected (see the sketch after this list).
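A minimal sketch of this within-school step, assuming a TCS of 42; the roster and identifiers are hypothetical:

```python
import random

def sample_students(roster, tcs=42):
    """Equal-probability sample of students within a sampled school.

    If the school lists more students than the target cluster size (TCS),
    draw a simple random sample of TCS students; otherwise take them all.
    """
    if len(roster) > tcs:
        return random.sample(roster, tcs)
    return list(roster)

roster = [f"student_{i:04d}" for i in range(120)]  # hypothetical school roster
print(len(sample_students(roster)))                # -> 42
```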


OUT-OF-SCHOOL PART

  • The PISA-D tests and contextual questionnaires for out-of-school youth were administered in five countries (Guatemala, Honduras, Panama, Paraguay, and Senegal).
  • About 90,000 households in the participating countries were screened to determine the eligibility of youth in those settings for the out-of-school assessment, yielding slightly fewer than 7,500 completed cases. This represents a hit rate (number of cases completed per number of dwelling units screened for eligibility) of around 8% (roughly 7,500 / 90,000) in the probability-sample component.
  • Another 900 cases were completed by eligible youth who were identified using a methodology that was not based on probability sampling (no response rate was computed for this component).
  • The instruments and survey operations were modified where necessary for the Main Survey, based on the results of the field trial.
Sample size

IN-SCHOOL PART

Per country/economy

Random sample of a minimum of 150 schools 

In total

Approximately 37,000 students 


OUT-OF-SCHOOL PART

Per country/economy

Probability sample of households: Guatemala (20,621), Honduras (19,019), Panama (17,163), Paraguay (24,880), and Senegal (8,020)  

In total 

Approximately 7,200 respondents 

Data collection techniques and instruments

THE ASSESSMENT

In-school assessment

  • Paper-based tests were used in participating countries and economies, with assessments lasting a total of two hours for each student.
  • The three domains (reading, mathematics, and science) were treated equally.
  • Test items were a mixture of multiple-choice questions and questions requiring students to construct their own responses.
  • The items were organized in groups based on a passage setting out a real-life situation.
  • About 390 minutes of test items were covered, with different students taking different combinations of test items (see the sketch after this list).
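A minimal sketch of how such a rotated design spreads a large item pool across individual two-hour sittings, assuming a simple cyclic assignment of 30-minute item clusters to booklets; the actual PISA-D booklet design is more elaborate:

```python
import random

# hypothetical pool: 13 clusters of 30 minutes each, i.e. ~390 minutes of items
clusters = [f"cluster_{i}" for i in range(13)]
CLUSTERS_PER_BOOKLET = 4  # 4 x 30 min = 120 min, one student's sitting

# cyclic rotation: each booklet holds a different window of clusters,
# so every cluster appears in several booklets and can be linked across them
booklets = [
    [clusters[(start + j) % len(clusters)] for j in range(CLUSTERS_PER_BOOKLET)]
    for start in range(len(clusters))
]

# each sampled student is randomly assigned one booklet
print(random.choice(booklets))
```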


Out-of-school assessment

  • Tablet-based tests were used in the five participating countries, with assessments lasting a total of 50 minutes for each respondent.
  • The test covered only two domains, reading and mathematics, and these were treated equally.
  • Test items were multiple-choice questions.
  • The items were organized in groups based on a passage setting out a real-life situation.
  • Cognitive instruments included a 10-minute Core Module of basic reading and mathematics skills to ensure that respondents had an appropriate level of skills to proceed to the full assessment.
    • An established minimum number of items answered correctly determined the next set of items to be presented to respondents in the second stage of the cognitive assessment.
    • The second stage was designed to take no longer than 35 minutes to complete.
  • Respondents who passed the Core Module were randomly assigned to one of the 12 forms measuring reading and mathematical literacy.
  • Respondents who failed the Core Module were directed to an assessment comprising all Reading Components items, expected to take no longer than 15 minutes. (The routing logic is sketched below.)
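A minimal sketch of this routing logic; the pass threshold here is a hypothetical stand-in for the established minimum, which the source does not specify:

```python
import random

PASS_THRESHOLD = 5  # hypothetical cut-off for the 10-minute Core Module
N_FORMS = 12        # forms measuring reading and mathematical literacy

def route_respondent(core_items_correct):
    """Decide the second-stage instrument from the Core Module result."""
    if core_items_correct >= PASS_THRESHOLD:
        # passed: random assignment to one of the 12 full forms (<= 35 min)
        return f"form_{random.randint(1, N_FORMS)}"
    # failed: directed to the Reading Components items (<= 15 min)
    return "reading_components"

print(route_respondent(7))  # e.g. form_3
print(route_respondent(2))  # reading_components
```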


BACKGROUND QUESTIONNAIRES 

In-school questionnaires

  • A student background questionnaire, taking about 35 minutes to complete, that gathered information about the students themselves, their homes, the school, and their learning experiences
  • A school questionnaire completed by principals that covered the school system and the learning environment
  • A teacher questionnaire that covered factors related to their own background, their teaching, students, and the school


Out-of-school questionnaires

  • An in-person interview with the youth
    • To gather information about the youths, their homes, and, where applicable, their school and learning experiences.
    • Completion time 30–35 minutes
  • A questionnaire for the person most knowledgeable about the youth, providing information about the out-of-school youth’s prenatal and early childhood living conditions, parental support for schooling and learning, and parental attitudes towards school and learning
  • A household observational questionnaire (completed by the interviewer) with questions about the youth’s housing and neighborhood


PISA LINKAGE

Starting with PISA 2022, instruments, approaches, and methodologies piloted in PISA-D will be integrated into main PISA.

Techniques
  • achievement or student test
  • questionnaire
Languages
  • The achievement test was administered in five different languages:
    • Spanish (5 countries)
    • English (1 country + 1 field trial)
    • French (1 country)
    • Khmer (1 country)
    • Wolof (1 country)
Translation procedures
  • Optimization of the English-language source version for translation through translatability assessment
  • Development of two source versions of the instruments, one in English and one in French
  • Double translation design
  • Preparation of detailed instructions for localization of the instruments for the field trial and for their review for the main survey
  • Preparation of translation and adaptation guidelines
  • Training of national staff in charge of translation and/or adaptation of the instruments
  • Validation of the translated and adapted national versions: verification by independent verifiers; review by the PISA-D Consortium and the translation referee or the questionnaires team
Quality control of operations

Measures during data collection

  • Procedures for sampling, translation, survey administration, and data processing were developed in accordance with the PISA-D Technical Standards for the in-school and out-of-school assessment and fully documented.
  • The PISA-D Consortium provided comprehensive manuals that explained the implementation of the survey:
    • They included:
      • Precise instructions for the work of school coordinators
      • Scripts for test administrators to use during the assessment sessions
      • Manuals for the interviewers
    • Proposed adaptations to survey procedures or modifications to the assessment session script were submitted to the PISA-D Consortium for approval prior to verification.
  • Participating countries and economies were required to ensure that test administrators worked with the school coordinator to prepare the assessment session, including:
    • Reviewing and updating the Student Tracking form.
    • Completing the Session Attendance form, which is designed to record students’ attendance and instrument allocation.
    • Completing the Session Report form, which is designed to summarize session times, any disturbance to the session, etc.
    • Ensuring that the number of test booklets and questionnaires collected from students tallied with the number sent to the school (paper-based assessment countries), or that all USB sticks used for the assessment were accounted for (computer-based assessment countries).
    • Sending the school questionnaire, student questionnaires, parent and teacher questionnaires (if applicable), and all test materials (both completed and not completed) to the national center after the testing.
  • The PISA-D Consortium responsible for overseeing survey operations implemented all phases of the PISA-D Quality Monitor (PQM) process.
    • Phases included:
      • Interviewing and hiring PQM candidates in each of the countries
      • Organizing their training
      • Selecting the schools to visit
      • Collecting information from the PQM visits
    • PQMs
      • Independent contractors located in participating countries who were hired by the international survey operations contractor.
      • They visited a sample of schools to observe test administration and to record the implementation of the documented field-operations procedures in the main survey.
    • Typically, three or four PQMs were hired per country, and they visited an average of 15 schools each. If a country had adjudicated regions, it was usually necessary to hire additional PQMs, as a minimum of five schools had to be observed in each adjudicated region.


Measures during data processing and cleaning

  • All quality-assurance data collected throughout the PISA-D assessment were entered and collated in a central data adjudication database on the quality of field operations, printing, translation, school and student sampling, and coding. 
  • Comprehensive reports were then generated for the PISA Adjudication Group.
    • The Technical Advisory Group (TAG) and the sampling referee reviewed this database to make country-by-country evaluations on the quality of field operations, printing, translation, school and student sampling, and coding.
    • The final report by TAG experts was then used for the purpose of data adjudication.
  • With the delivery of the out-of-school data from the National Centre, the Data Management contractor reviewed detailed documentation of any implemented or required confidentiality practices in order to evaluate their impact on the data management, cleaning, and analysis processes.