ICILS 2013 OUTCOME MEASURES

Assessment domain(s)
  • Computer literacy
  • Information literacy
Achievement and test scales
Scale Creation

Prior to scaling, an extensive analysis of scaling properties was carried out, covering missing values, test coverage, item fit, differential item functioning by gender, and cross-national measurement equivalence.

The ICILS test items were scaled using item response modeling with the (one-parameter) Rasch model.

  • The CIL scale was derived from student responses to the 62 test questions and large tasks (which corresponded to a total of 81 score points).
  • Most questions and tasks corresponded to a single item each; however, each ICILS large task was scored against a set of criteria (each criterion with its own unique set of scores) relating to the properties of the task. Each large-task assessment criterion was therefore also an item in ICILS.
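
As an illustration of the approach (not the operational scaling software), the dichotomous Rasch model and a maximum-likelihood ability estimate for a single response pattern can be sketched as follows; the item difficulties used here are hypothetical:

```python
import math

def rasch_prob(theta, b):
    """P(correct) under the dichotomous Rasch model: exp(theta - b) / (1 + exp(theta - b))."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def ml_ability(responses, difficulties, iters=50):
    """Newton-Raphson maximum-likelihood estimate of ability (theta)
    for one student's 0/1 response pattern, given known item difficulties."""
    theta = 0.0
    for _ in range(iters):
        probs = [rasch_prob(theta, b) for b in difficulties]
        grad = sum(x - p for x, p in zip(responses, probs))  # score function
        info = sum(p * (1 - p) for p in probs)               # test information
        theta += grad / info
    return theta

# Hypothetical item difficulties (in logits) and one response pattern
difficulties = [-1.0, -0.5, 0.0, 0.5, 1.0]
responses = [1, 1, 1, 0, 0]
theta_hat = ml_ability(responses, difficulties)
```

In the operational scaling, item difficulties were estimated jointly from the international calibration sample; the sketch above only shows the per-student ability step for fixed difficulties.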

Additionally, plausible values were generated as ability estimates with full conditioning, so that all student-level and between-school differences were taken into account.
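A minimal sketch of the idea behind plausible values, assuming a normal measurement likelihood and a normal conditioning-model prior (a simplified stand-in for full conditioning on background variables):

```python
import random

def draw_plausible_values(theta_hat, se, cond_mean, cond_var, n_pv=5, rng=None):
    """Draw plausible values from an (assumed normal) posterior combining
    the measurement likelihood N(theta_hat, se^2) with a conditioning-model
    prediction N(cond_mean, cond_var) for the same student."""
    rng = rng or random.Random(2013)
    # Precision-weighted normal-normal combination
    precision = 1 / se**2 + 1 / cond_var
    post_var = 1 / precision
    post_mean = post_var * (theta_hat / se**2 + cond_mean / cond_var)
    return [rng.gauss(post_mean, post_var ** 0.5) for _ in range(n_pv)]

# Hypothetical inputs: ability estimate, its standard error, and the
# conditioning model's prediction for this student
pvs = draw_plausible_values(theta_hat=0.4, se=0.5, cond_mean=0.1, cond_var=1.0)
```

Analyses then use all plausible values jointly (not their average) so that measurement uncertainty propagates into population estimates.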

The final reporting scale was set to a metric with a mean of 500 (the ICILS average score) and a standard deviation of 100 for the equally weighted national samples.
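The transformation to the reporting metric can be sketched as a weighted standardization; the logit scores below are hypothetical, and ICILS applied this with equally weighted national samples:

```python
def to_reporting_scale(logits, weights):
    """Linearly transform logit scores to the reporting metric
    (weighted mean 500, weighted SD 100)."""
    wsum = sum(weights)
    mean = sum(w * x for w, x in zip(weights, logits)) / wsum
    var = sum(w * (x - mean) ** 2 for w, x in zip(weights, logits)) / wsum
    sd = var ** 0.5
    return [500 + 100 * (x - mean) / sd for x in logits]

# Hypothetical logit scores with equal weights
scores = to_reporting_scale([-1.2, 0.3, 0.8, -0.4, 1.1, -0.6], [1.0] * 6)
```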

Four proficiency levels were established; test item locations on the CIL achievement scale allowed each level to be described, complete with example test items.
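
Mapping a scale score to a proficiency level is then a simple cut-score lookup; the cut-offs below are illustrative placeholders, not the official ICILS 2013 level boundaries:

```python
from bisect import bisect_right

def proficiency_level(score, cuts=(407, 492, 576, 661)):
    """Map a CIL scale score to a proficiency level (0 = below Level 1).
    The cut scores here are illustrative placeholders only."""
    return bisect_right(cuts, score)
```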

 

List of Achievement Scales

The computer and information literacy scale

Questionnaire and background scales
Scale Creation

Two general types of indices, both derived from the ICILS questionnaires, could be distinguished:

Simple indices

  • They were constructed through arithmetical transformation or simple recoding.
  • For example: ratios between ICT resources and students, or an index of immigration background based on the country of birth of students and their parents.
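
Both kinds of simple index amount to a direct transformation of raw responses; a sketch, using a hypothetical coding scheme (not the official ICILS recoding rules):

```python
def computer_student_ratio(n_computers, n_students):
    """Simple index via arithmetical transformation: computers per student."""
    return n_computers / n_students if n_students else None

def immigrant_background(student_abroad, parent1_abroad, parent2_abroad):
    """Simple index via recoding (hypothetical scheme):
    0 = native (at least one parent born in the country of test),
    1 = second generation (student born in country, both parents abroad),
    2 = first generation (student and both parents born abroad)."""
    if parent1_abroad and parent2_abroad:
        return 2 if student_abroad else 1
    return 0
```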

Scale indices

  • They were derived from the scaling of items, typically through item response modeling of dichotomous or Likert-type items.
  • Item response modeling (applying the Rasch partial credit model) provided an adequate tool for deriving 10 international student questionnaire scales, nine teacher questionnaire scales, and seven school questionnaire scales.
  • A composite index reflecting socioeconomic background was derived using principal component analysis of three home background indicators, namely, parental occupation, parental education, and home literacy resources.
  • Generally, the scales used in ICILS had sound psychometric properties, such as high reliability.
  • Confirmatory factor analyses showed satisfactory model fit for the measurement models underpinning the scaling of the questionnaire data.
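
The PCA-based composite can be sketched as scoring students on the first principal component of their standardized indicators; the data below are hypothetical, with columns standing in for parental occupation, parental education, and home literacy resources:

```python
import numpy as np

def ses_composite(indicators):
    """Score each student on the first principal component of the
    standardized home-background indicators."""
    X = np.asarray(indicators, dtype=float)
    Z = (X - X.mean(axis=0)) / X.std(axis=0)  # standardize each indicator
    cov = np.cov(Z, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)    # eigenvalues in ascending order
    pc1 = eigvecs[:, -1]                      # loadings of the first component
    if pc1.sum() < 0:                         # fix sign so higher = higher SES
        pc1 = -pc1
    return Z @ pc1

# Hypothetical indicators: occupation code, years of education, number of books
data = [[3, 12, 25], [5, 16, 100], [4, 14, 60], [2, 10, 10], [5, 18, 200]]
scores = ses_composite(data)
```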

Only scale indices are reported below.

 

List of Background Scales

Student questionnaire

Students’ use of ICT applications

  • Students’ use of specific ICT applications
  • Students’ use of ICT for social communication
  • Students’ use of ICT for exchanging information
  • Students’ use of ICT for recreation

Students’ school-related ICT use

  • Students’ use of ICT for (school-related) study purposes
  • Students’ use of ICT during lessons at school
  • Students’ reports on learning ICT tasks at school

Students’ ICT self-efficacy, interest, and enjoyment

  • Students’ confidence (ICT self-efficacy) in solving basic computer-related tasks
  • Students’ confidence (ICT self-efficacy) in solving advanced computer-related tasks
  • Students’ interest and enjoyment in using computers and computing

 

Teacher questionnaire

Teachers’ confidence in computer tasks (self-efficacy)

Teachers’ use of ICT applications for teaching

Teachers’ use of ICT for activities and practices in class

  • Teachers’ use of ICT for learning at school
  • Teachers’ use of ICT for teaching at school

Teachers’ emphasis on ICT in teaching

Teachers’ views on using ICT for teaching and learning

  • Positive views on using ICT in teaching and learning
  • Negative views on using ICT in teaching and learning

Teachers’ views on the context for ICT use at their school

  • Teachers’ perspectives on the lack of computer resources at school
  • Teachers’ perspectives on collaboration between teachers in using ICT

 

School questionnaires

ICT coordinators’ reports on ICT resources at school

ICT coordinators’ perceptions of hindrances to ICT use at school

  • ICT use hindered in teaching and learning: lack of hardware
  • ICT use hindered in teaching and learning: other obstacles

School principals’ perceptions of the importance of ICT at school

  • Principals’ perceptions of using ICT for educational outcomes
  • Principals’ perceptions of the ICT use expected of teachers: learning

School principals’ views of ICT priorities at school

  • Principals’ views of priorities for facilitating use of ICT: hardware
  • Principals’ views of priorities for facilitating use of ICT: support