Cross-cultural comparability of noncognitive constructs in TIMSS and PISA

Jia He, Fabián Barrera-pedemonte, Janine Buchholz

Research output: Contribution to journal › Article › Scientific › peer-review

Abstract

Noncognitive assessments in the Programme for International Student Assessment (PISA) and the Trends in International Mathematics and Science Study (TIMSS) share certain similarities and provide complementary information, yet their comparability is seldom checked and convergence not sought. We made use of student self-report data on Instrumental Motivation, Enjoyment of Science and Sense of Belonging to School targeted in both surveys in 29 overlapping countries to (1) demonstrate levels of measurement comparability, (2) check convergence of different scaling methods within each survey and (3) check convergence of these constructs with student achievement across surveys. We found that the three scales in either survey (except Sense of Belonging to School in PISA) reached at least metric invariance. The scale scores from the multigroup confirmatory factor analysis and the item response theory analysis were highly correlated, pointing to the robustness of the scaling methods. The correlations between each construct and achievement were generally positive within each culture in each survey, and the correlational pattern was similar across surveys (except for Sense of Belonging), indicating a certain convergence in the cross-survey validation. We stress the importance of checking measurement invariance before making comparative inferences, and we discuss implications for the quality and relevance of these constructs in understanding learning.
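The second aim above, checking convergence of scaling methods within a survey, amounts to correlating the scale scores each method produces for the same students. A minimal sketch of that check, using simulated data rather than the authors' TIMSS/PISA datasets (the country codes, sample sizes and noise levels here are purely hypothetical assumptions for illustration):

```python
# Illustrative sketch only (not the authors' code): two scaling methods
# (e.g. multigroup-CFA factor scores vs IRT ability estimates) are
# "convergent" if their scores for the same students correlate highly.
# All data are simulated; country codes and sample sizes are hypothetical.
import numpy as np

rng = np.random.default_rng(42)

def pearson(x, y):
    """Pearson correlation between two score vectors."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    return float(np.mean(x * y))

correlations = {}
for country in ["AUS", "CHL", "DEU"]:  # hypothetical subset of the 29 countries
    theta = rng.normal(size=500)       # latent construct per student
    # Each method recovers the latent trait with its own estimation noise.
    cfa_scores = theta + rng.normal(scale=0.3, size=500)
    irt_scores = theta + rng.normal(scale=0.3, size=500)
    correlations[country] = pearson(cfa_scores, irt_scores)

for country, r in correlations.items():
    print(f"{country}: r = {r:.2f}")
```

High within-country correlations (here close to 1 by construction) would correspond to the paper's finding that the two scaling approaches yield essentially interchangeable scores.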
Original language: English
Pages (from-to): 1-17
Journal: Assessment in Education: Principles, Policy & Practice
DOI: 10.1080/0969594X.2018.1469467
Publication status: E-pub ahead of print - 2019

Cite this

@article{e49b9446bfec4429bddeaef2a23e4ca2,
title = "Cross-cultural comparability of noncognitive constructs in TIMSS and PISA",
abstract = "Noncognitive assessments in the Programme for International Student Assessment (PISA) and the Trends in International Mathematics and Science Study (TIMSS) share certain similarities and provide complementary information, yet their comparability is seldom checked and convergence not sought. We made use of student self-report data on Instrumental Motivation, Enjoyment of Science and Sense of Belonging to School targeted in both surveys in 29 overlapping countries to (1) demonstrate levels of measurement comparability, (2) check convergence of different scaling methods within each survey and (3) check convergence of these constructs with student achievement across surveys. We found that the three scales in either survey (except Sense of Belonging to School in PISA) reached at least metric invariance. The scale scores from the multigroup confirmatory factor analysis and the item response theory analysis were highly correlated, pointing to the robustness of the scaling methods. The correlations between each construct and achievement were generally positive within each culture in each survey, and the correlational pattern was similar across surveys (except for Sense of Belonging), indicating a certain convergence in the cross-survey validation. We stress the importance of checking measurement invariance before making comparative inferences, and we discuss implications for the quality and relevance of these constructs in understanding learning.",
author = "Jia He and Fabi{\'a}n Barrera-pedemonte and Janine Buchholz",
year = "2019",
doi = "10.1080/0969594X.2018.1469467",
language = "English",
pages = "1--17",
journal = "Assessment in Education: Principles, Policy & Practice",
issn = "0969-594X",
publisher = "Routledge",
}
