Reproducibility of individual effect sizes in meta-analyses in psychology

Research output: Working paper › Other research output

Abstract

To determine the reproducibility of psychological meta-analyses, we investigated whether we could reproduce 500 primary study effect sizes drawn from 33 published meta-analyses based on the information given in the meta-analyses, and whether recomputations of primary study effect sizes altered the overall results of the meta-analysis. Results showed that almost half (k = 224) of all sampled primary effect sizes could not be reproduced based on the reported information in the meta-analysis, mostly because of incomplete or missing information on how effect sizes from primary studies were selected and computed. Overall, this led to small discrepancies in the computation of mean effect sizes, confidence intervals, and heterogeneity estimates in 13 out of 33 meta-analyses. We provide recommendations to improve transparency in the reporting of the entire meta-analytic process, including the use of preregistration, data and workflow sharing, and explicit coding practices.
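The check described in the abstract amounts to recomputing primary effect sizes and then re-running the meta-analytic model to see whether the pooled mean, its confidence interval, and the heterogeneity estimates shift. As a minimal illustrative sketch (not the authors' code; the effect sizes, sampling variances, and the DerSimonian-Laird estimator below are assumptions for illustration only), such a recomputation could look like:

# Minimal sketch (illustrative, not from the paper): recompute a random-effects
# summary (DerSimonian-Laird) from primary effect sizes to compare against the
# values reported in a meta-analysis. The inputs below are hypothetical.
import math

yi = [0.21, 0.35, 0.10, 0.48]   # hypothetical primary effect sizes (e.g., Hedges' g)
vi = [0.04, 0.02, 0.05, 0.03]   # hypothetical sampling variances

# Fixed-effect weights, pooled mean, and Q statistic
wi = [1 / v for v in vi]
fe_mean = sum(w * y for w, y in zip(wi, yi)) / sum(wi)
q = sum(w * (y - fe_mean) ** 2 for w, y in zip(wi, yi))
df = len(yi) - 1

# DerSimonian-Laird between-study variance (tau^2) and I^2
c = sum(wi) - sum(w ** 2 for w in wi) / sum(wi)
tau2 = max(0.0, (q - df) / c)
i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0

# Random-effects summary, standard error, and 95% confidence interval
wi_re = [1 / (v + tau2) for v in vi]
re_mean = sum(w * y for w, y in zip(wi_re, yi)) / sum(wi_re)
se = math.sqrt(1 / sum(wi_re))
ci = (re_mean - 1.96 * se, re_mean + 1.96 * se)

print(f"mean = {re_mean:.3f}, 95% CI = [{ci[0]:.3f}, {ci[1]:.3f}], "
      f"tau^2 = {tau2:.3f}, I^2 = {i2:.1f}%")

Comparing the printed pooled mean, confidence interval, and heterogeneity estimates against the values reported in the published meta-analysis is the kind of reproducibility check the paper performs, here with recomputed rather than reported primary effect sizes.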
Original language: English
Publisher: PsyArXiv Preprints
DOI: 10.31234/osf.io/g5ryh
Publication status: Published - 17 Oct 2019

Keywords

  • meta-analysis
  • reproducibility
  • reporting errors
  • effect sizes

Cite this

@techreport{4f3481cab56c4bb2a54a9115a2c162c7,
title = "Reproducibility of individual effect sizes in meta-analyses in psychology",
abstract = "To determine the reproducibility of psychological meta-analyses, we investigated whether we could reproduce 500 primary study effect sizes drawn from 33 published meta-analyses based on the information given in the meta-analyses, and whether recomputations of primary study effect sizes altered the overall results of the meta-analysis. Results showed that almost half (k = 224) of all sampled primary effect sizes could not be reproduced based on the reported information in the meta-analysis, mostly because of incomplete or missing information on how effect sizes from primary studies were selected and computed. Overall, this led to small discrepancies in the computation of mean effect sizes, confidence intervals, and heterogeneity estimates in 13 out of 33 meta-analyses. We provide recommendations to improve transparency in the reporting of the entire meta-analytic process, including the use of preregistration, data and workflow sharing, and explicit coding practices.",
keywords = "meta-analysis, reproducibility, reporting errors, effect sizes",
author = "Esther Maassen and {van Assen}, Marcel and Mich{\`e}le Nuijten and {Olsson Collentine}, Anton and Jelte Wicherts",
year = "2019",
month = "10",
day = "17",
doi = "10.31234/osf.io/g5ryh",
language = "English",
publisher = "PsyArXiv Preprints",
type = "WorkingPaper",
institution = "PsyArXiv Preprints",
}
