Cross-cultural comparability of noncognitive constructs in TIMSS and PISA

Jia He*, Fabián Barrera-pedemonte, Janine Buchholz

*Corresponding author for this work

Research output: Contribution to journal › Article › Scientific › peer-review

Abstract

Noncognitive assessments in the Programme for International Student Assessment (PISA) and the Trends in International Mathematics and Science Study (TIMSS) share certain similarities and provide complementary information, yet their comparability is seldom checked and their convergence rarely examined. We used student self-report data on Instrumental Motivation, Enjoyment of Science and Sense of Belonging to School, targeted in both surveys across 29 overlapping countries, to (1) demonstrate levels of measurement comparability, (2) check the convergence of different scaling methods within each survey and (3) check the convergence of these constructs with student achievement across surveys. We found that the three scales in either survey (except Sense of Belonging to School in PISA) reached at least metric invariance. The scale scores from the multigroup confirmatory factor analysis and the item response theory analysis were highly correlated, pointing to the robustness of the scaling methods. The correlations between each construct and achievement were generally positive within each culture in each survey, and the correlational pattern was similar across surveys (except for Sense of Belonging), indicating a degree of convergence in the cross-survey validation. We stress the importance of checking measurement invariance before making comparative inferences, and we discuss implications for the quality and relevance of these constructs in understanding learning.
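
The cross-survey convergence check described in step (3) can be illustrated with a minimal sketch: compute, within each country and each survey, the correlation between a construct's scale score and achievement, then compare the resulting country-level correlation patterns across the two surveys. The sketch below uses synthetic data and hypothetical column names (country, scale_score, achievement); it is not the authors' actual analysis pipeline, which relies on the published PISA and TIMSS scale scores and achievement estimates.

import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
N_COUNTRIES = 29

# Country-specific strength of the construct-achievement relation; sharing it
# across the two simulated surveys is what produces "convergence" here.
true_slope = rng.uniform(0.1, 0.5, size=N_COUNTRIES)

def make_survey(slopes, n_students=200):
    """Simulate one survey: a construct scale score and achievement per student, by country."""
    rows = []
    for country, slope in enumerate(slopes):
        score = rng.normal(size=n_students)
        achievement = slope * score + rng.normal(size=n_students)
        rows.append(pd.DataFrame({"country": country,
                                  "scale_score": score,
                                  "achievement": achievement}))
    return pd.concat(rows, ignore_index=True)

def within_country_correlations(df):
    """Pearson correlation between scale score and achievement, computed per country."""
    return df.groupby("country").apply(
        lambda g: g["scale_score"].corr(g["achievement"]))

r_pisa = within_country_correlations(make_survey(true_slope))
r_timss = within_country_correlations(make_survey(true_slope))

# Cross-survey convergence: similarity of the 29 country-level correlations
print("Pattern similarity (PISA vs. TIMSS):",
      round(float(np.corrcoef(r_pisa, r_timss)[0, 1]), 2))
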
Original language: English
Pages (from-to): 369-385
Journal: Assessment in Education: Principles, Policy & Practice
Volume: 26
Issue number: 4
DOI: https://doi.org/10.1080/0969594X.2018.1469467
Publication status: Published - 2019

Keywords

  • CONTEXT
  • MATHEMATICS ACHIEVEMENT
  • MEASUREMENT INVARIANCE
  • MOTIVATION
  • PISA
  • SCHOOL
  • SCIENCE ACHIEVEMENT
  • STUDENTS
  • TIMSS
  • cross-survey validation
  • measurement invariance

Cite this

He, J., Barrera-pedemonte, F., & Buchholz, J. (2019). Cross-cultural comparability of noncognitive constructs in TIMSS and PISA. Assessment in Education: Principles, Policy & Practice, 26(4), 369-385. https://doi.org/10.1080/0969594X.2018.1469467