The replication paradox

Combining studies can decrease accuracy of effect size estimates

Research output: Contribution to journal › Article › Scientific › peer-review

Abstract

Replication is often viewed as the demarcation between science and nonscience. However, contrary to the commonly held view, we show that in the current (selective) publication system replications may increase bias in effect size estimates. Specifically, we examine the effect of replication on bias in estimated population effect size as a function of publication bias and the studies' sample size or power. We analytically show that incorporating the results of published replication studies will in general not lead to less bias in the estimated population effect size. We therefore conclude that mere replication will not solve the problem of overestimation of effect sizes. We will discuss the implications of our findings for interpreting results of published and unpublished studies, and for conducting and interpreting results of meta-analyses. We also discuss solutions for the problem of overestimation of effect sizes, such as discarding and not publishing small studies with low power, and implementing practices that completely eliminate publication bias (e.g., study registration).
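The mechanism described in the abstract can be sketched with a small Monte Carlo simulation. This is an illustration written for this page, not the authors' analysis: the true effect size, per-group sample size, and selection rule below are all assumed values chosen to make the bias visible.

```python
import random
import statistics

random.seed(1)

TRUE_D = 0.2   # assumed true standardized effect size
N = 25         # assumed per-group sample size of each underpowered study
SE = (2 / N) ** 0.5          # large-sample standard error of observed d
CRIT = 1.96 * SE             # approximate one-sided significance threshold

def observed_d():
    """One study's observed d, approximated as Normal(TRUE_D, SE)."""
    return random.gauss(TRUE_D, SE)

published_originals = []
combined_estimates = []
for _ in range(20000):
    d1 = observed_d()
    if d1 <= CRIT:
        # Publication bias: nonsignificant original studies never appear.
        continue
    d2 = observed_d()        # replication, published regardless of outcome
    published_originals.append(d1)
    # Naive combination of the published original with its replication:
    combined_estimates.append((d1 + d2) / 2)

print("true effect:                 ", TRUE_D)
print("mean published original d:   ", round(statistics.mean(published_originals), 3))
print("mean original+replication d: ", round(statistics.mean(combined_estimates), 3))
```

Because only significant originals survive selection, the published originals overestimate the true effect; averaging in an unbiased replication halves the bias but does not remove it, matching the abstract's point that mere replication does not solve overestimation.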
Original language: English
Pages (from-to): 172-182
Journal: Review of General Psychology
Volume: 19
Issue number: 2
DOIs: 10.1037/gpr0000034
Publication status: Published - 2015

Keywords

  • replication
  • effect size
  • publication bias
  • power
  • meta-analysis

Cite this

@article{d78a1f062a3145bcb782b1c165ba178f,
title = "The replication paradox: Combining studies can decrease accuracy of effect size estimates",
abstract = "Replication is often viewed as the demarcation between science and nonscience. However, contrary to the commonly held view, we show that in the current (selective) publication system replications may increase bias in effect size estimates. Specifically, we examine the effect of replication on bias in estimated population effect size as a function of publication bias and the studies' sample size or power. We analytically show that incorporating the results of published replication studies will in general not lead to less bias in the estimated population effect size. We therefore conclude that mere replication will not solve the problem of overestimation of effect sizes. We will discuss the implications of our findings for interpreting results of published and unpublished studies, and for conducting and interpreting results of meta-analyses. We also discuss solutions for the problem of overestimation of effect sizes, such as discarding and not publishing small studies with low power, and implementing practices that completely eliminate publication bias (e.g., study registration).",
keywords = "replication, effect size, publication bias, power, meta-analysis",
author = "Nuijten, {Michele B.} and {van Assen}, {Marcel A. L. M.} and Veldkamp, {Coosje L. S.} and Wicherts, {Jelte M.}",
year = "2015",
doi = "10.1037/gpr0000034",
language = "English",
volume = "19",
pages = "172--182",
journal = "Review of General Psychology",
issn = "1089-2680",
publisher = "American Psychological Association",
number = "2",
}

The replication paradox: Combining studies can decrease accuracy of effect size estimates. / Nuijten, Michele B.; van Assen, Marcel A. L. M.; Veldkamp, Coosje L. S.; Wicherts, Jelte M.

In: Review of General Psychology, Vol. 19, No. 2, 2015, p. 172-182.


TY - JOUR

T1 - The replication paradox

T2 - Combining studies can decrease accuracy of effect size estimates

AU - Nuijten, Michele B.

AU - van Assen, Marcel A. L. M.

AU - Veldkamp, Coosje L. S.

AU - Wicherts, Jelte M.

PY - 2015

Y1 - 2015

N2 - Replication is often viewed as the demarcation between science and nonscience. However, contrary to the commonly held view, we show that in the current (selective) publication system replications may increase bias in effect size estimates. Specifically, we examine the effect of replication on bias in estimated population effect size as a function of publication bias and the studies' sample size or power. We analytically show that incorporating the results of published replication studies will in general not lead to less bias in the estimated population effect size. We therefore conclude that mere replication will not solve the problem of overestimation of effect sizes. We will discuss the implications of our findings for interpreting results of published and unpublished studies, and for conducting and interpreting results of meta-analyses. We also discuss solutions for the problem of overestimation of effect sizes, such as discarding and not publishing small studies with low power, and implementing practices that completely eliminate publication bias (e.g., study registration).

AB - Replication is often viewed as the demarcation between science and nonscience. However, contrary to the commonly held view, we show that in the current (selective) publication system replications may increase bias in effect size estimates. Specifically, we examine the effect of replication on bias in estimated population effect size as a function of publication bias and the studies' sample size or power. We analytically show that incorporating the results of published replication studies will in general not lead to less bias in the estimated population effect size. We therefore conclude that mere replication will not solve the problem of overestimation of effect sizes. We will discuss the implications of our findings for interpreting results of published and unpublished studies, and for conducting and interpreting results of meta-analyses. We also discuss solutions for the problem of overestimation of effect sizes, such as discarding and not publishing small studies with low power, and implementing practices that completely eliminate publication bias (e.g., study registration).

KW - replication

KW - effect size

KW - publication bias

KW - power

KW - meta-analysis

U2 - 10.1037/gpr0000034

DO - 10.1037/gpr0000034

M3 - Article

VL - 19

SP - 172

EP - 182

JO - Review of General Psychology

JF - Review of General Psychology

SN - 1089-2680

IS - 2

ER -