Evaluating the risk of nonresponse bias in educational large-scale assessments with school nonresponse questionnaires

Periodical: Large-scale Assessments in Education
Volume: 5
Year: 2017
Relates to study/studies: TIMSS 2011, PIRLS 2011

Evaluating the risk of nonresponse bias in educational large-scale assessments with school nonresponse questionnaires: A theoretical study

Abstract

Survey participation rates can have a direct impact on the validity of the data collected, since nonresponse always carries a risk of bias. The International Association for the Evaluation of Educational Achievement (IEA) has therefore set very high standards for minimum survey participation rates. Nonresponse in IEA studies varies between studies and cycles. Participation is at greater risk at the school level than within schools, because school students are generally more willing to cooperate than adult respondents (e.g., university students or school teachers). Across all studies conducted by the IEA during the last decade, between 7 and 33% of participating countries failed to meet the minimum participation rates at the school level. Quantifying the bias introduced by nonresponse is practically impossible under the currently implemented design. During the last decade, social researchers have introduced and developed the concept of nonresponse questionnaires: shortened instruments administered to nonrespondents that aim to capture information correlated with both the survey's main outcome variable(s) and the respondent's propensity to participate. In this paper, we suggest a method for developing such questionnaires for nonresponding schools in IEA studies. To this end, we investigated school characteristics associated with students' average achievement scores, using correlational and multivariate regression analyses in three recent IEA studies. We developed regression models that explain up to 77% of the variance in school mean achievement scores with 11 or fewer school questionnaire variables. On average across all countries, the R² of these models was 0.24 (PIRLS), 0.34 (TIMSS grade 4), and 0.36 (TIMSS grade 8), using 6–11 variables. We suggest that data from such questionnaires can help evaluate bias risks effectively. Further, we argue that, for countries with low participation rates, a change should be considered in how nonresponse adjustment factors are computed, moving to a system in which a school's participation propensity determines its nonresponse adjustment factor.
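
The sketch below is not the authors' code; it is a minimal Python illustration, on synthetic data with hypothetical variable names, of the two ideas summarized in the abstract: (1) a school-level regression whose R² indicates how much of the variance in school mean achievement a few school questionnaire variables explain, and (2) a logistic model of school participation whose predicted propensities yield inverse-propensity nonresponse adjustment factors. A real analysis of TIMSS or PIRLS data would also need to account for sampling weights and plausible values, which this toy example ignores.

```python
# Illustrative sketch only -- synthetic data, hypothetical variable names.
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression

rng = np.random.default_rng(42)
n_schools = 300

# Synthetic school questionnaire variables (assumed for illustration).
X = np.column_stack([
    rng.normal(size=n_schools),        # e.g., socioeconomic composition index
    rng.normal(size=n_schools),        # e.g., instructional resources index
    rng.integers(0, 2, n_schools),     # e.g., urban (1) vs. rural (0) location
])

# Synthetic school mean achievement, loosely related to the predictors.
school_mean_score = (500 + 40 * X[:, 0] + 15 * X[:, 1] + 10 * X[:, 2]
                     + rng.normal(scale=40, size=n_schools))

# (1) How much between-school achievement variance do the variables explain?
ols = LinearRegression().fit(X, school_mean_score)
print(f"R^2 of school-level model: {ols.score(X, school_mean_score):.2f}")

# (2) Synthetic participation indicator whose propensity depends on the same
#     school characteristics.
logit = -0.5 * X[:, 0] + 0.3 * X[:, 2] + rng.normal(size=n_schools)
participated = (1 / (1 + np.exp(-logit)) > rng.uniform(size=n_schools)).astype(int)

prop_model = LogisticRegression().fit(X, participated)
propensity = prop_model.predict_proba(X)[:, 1]

# Propensity-based nonresponse adjustment: each participating school receives
# the inverse of its estimated participation propensity as an adjustment factor.
adjustment = np.where(participated == 1, 1.0 / propensity, 0.0)
print("Mean adjustment factor among participants:",
      round(adjustment[participated == 1].mean(), 2))
```

In this toy setup, schools with characteristics that make participation less likely receive larger adjustment factors, which is the intuition behind letting a school's participation propensity determine its nonresponse adjustment, rather than applying a uniform within-class adjustment.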