Examining reproducibility in psychology

A hybrid method for combining a statistically significant original study and a replication

Research output: Contribution to journal › Article › Scientific › peer-review


Abstract

The unrealistically high rate of positive results within psychology has increased the attention to replication research. However, researchers who conduct a replication and want to statistically combine the results of their replication with a statistically significant original study encounter problems when using traditional meta-analysis techniques. The original study’s effect size is most probably overestimated because it is statistically significant, and this bias is not taken into consideration in traditional meta-analysis. We have developed a hybrid method that does take the statistical significance of an original study into account and enables (a) accurate effect size estimation, (b) estimation of a confidence interval, and (c) testing of the null hypothesis of no effect. We analytically approximate the performance of the hybrid method and describe its statistical properties. By applying the hybrid method to data from the Reproducibility Project: Psychology (Open Science Collaboration, 2015), we demonstrate that the conclusions based on the hybrid method are often in line with those of the replication, suggesting that many published psychological studies have smaller effect sizes than those reported in the original study, and that some effects may even be absent. We offer hands-on guidelines for how to statistically combine an original study and replication, and have developed a Web-based application (https://rvanaert.shinyapps.io/hybrid) for applying the hybrid method.
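The core idea can be sketched in a few lines. The hybrid method itself is built on p-uniform-style conditional p-values; the sketch below is a simplified stand-in that captures the same principle: because the original study was selected for significance, its likelihood contribution is truncated to the significant region, while the replication (not selected on significance) contributes an ordinary likelihood. All inputs are hypothetical illustrative numbers, and this is not the authors' exact estimator.

```python
# Simplified illustration of combining a significant original study with a
# replication while correcting for selection on significance. NOT the exact
# hybrid method from the paper (which uses conditional p-values / p-uniform);
# a conditional-likelihood sketch with made-up numbers. Requires scipy.
from scipy.optimize import minimize_scalar
from scipy.stats import norm

# Hypothetical inputs: effect sizes (e.g., Fisher-z) and standard errors.
z_orig, se_orig = 0.40, 0.12   # original study (statistically significant)
z_rep,  se_rep  = 0.15, 0.10   # replication (no selection on significance)

# Two-sided alpha = .05 significance threshold on the effect-size scale.
crit = norm.ppf(0.975) * se_orig

def neg_log_lik(theta):
    # Original study: normal density truncated to the significant region
    # (observed effect > crit), which corrects for publication selection.
    ll_orig = (norm.logpdf(z_orig, loc=theta, scale=se_orig)
               - norm.logsf(crit, loc=theta, scale=se_orig))
    # Replication: ordinary (untruncated) normal likelihood.
    ll_rep = norm.logpdf(z_rep, loc=theta, scale=se_rep)
    return -(ll_orig + ll_rep)

res = minimize_scalar(neg_log_lik, bounds=(-2.0, 2.0), method="bounded")
print(f"bias-corrected combined estimate: {res.x:.3f}")
```

A naive fixed-effect meta-analysis of these two studies would inverse-variance-weight both estimates as if neither were selected; the truncation term above discounts the original study's inflated estimate, so the corrected estimate lands below the naive one, in line with the paper's finding that combined estimates are typically smaller than originally reported.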
Original language: English
Pages (from-to): 1515-1539
Journal: Behavior Research Methods
Volume: 50
Issue number: 4
DOI: 10.3758/s13428-017-0967-6
Publication status: Published - 2018

Keywords

  • ATTENTION
  • BIAS
  • CONFLICT
  • DECISION
  • EFFECT SIZE
  • EXCESS
  • IMPLICIT
  • MEMORY
  • METAANALYSIS
  • Meta-analysis
  • Replication
  • Reproducibility
  • SINGLE
  • p-Uniform

Cite this

@article{59e9924d513245c2bbd18a1b14e69b10,
title = "Examining reproducibility in psychology: A hybrid method for combining a statistically significant original study and a replication",
abstract = "The unrealistically high rate of positive results within psychology has increased the attention to replication research. However, researchers who conduct a replication and want to statistically combine the results of their replication with a statistically significant original study encounter problems when using traditional meta-analysis techniques. The original study’s effect size is most probably overestimated because it is statistically significant, and this bias is not taken into consideration in traditional meta-analysis. We have developed a hybrid method that does take the statistical significance of an original study into account and enables (a) accurate effect size estimation, (b) estimation of a confidence interval, and (c) testing of the null hypothesis of no effect. We analytically approximate the performance of the hybrid method and describe its statistical properties. By applying the hybrid method to data from the Reproducibility Project: Psychology (Open Science Collaboration, 2015), we demonstrate that the conclusions based on the hybrid method are often in line with those of the replication, suggesting that many published psychological studies have smaller effect sizes than those reported in the original study, and that some effects may even be absent. We offer hands-on guidelines for how to statistically combine an original study and replication, and have developed a Web-based application (https://rvanaert.shinyapps.io/hybrid) for applying the hybrid method.",
keywords = "ATTENTION, BIAS, CONFLICT, DECISION, EFFECT SIZE, EXCESS, IMPLICIT, MEMORY, METAANALYSIS, Meta-analysis, Replication, Reproducibility, SINGLE, p-Uniform",
author = "{Van Aert}, R.C.M. and {Van Assen}, M.A.L.M.",
year = "2018",
doi = "10.3758/s13428-017-0967-6",
language = "English",
volume = "50",
pages = "1515--1539",
journal = "Behavior Research Methods",
issn = "1554-351X",
publisher = "Springer",
number = "4",
}

TY - JOUR

T1 - Examining reproducibility in psychology

T2 - A hybrid method for combining a statistically significant original study and a replication

AU - Van Aert, R.C.M.

AU - Van Assen, M.A.L.M.

PY - 2018

Y1 - 2018

N2 - The unrealistically high rate of positive results within psychology has increased the attention to replication research. However, researchers who conduct a replication and want to statistically combine the results of their replication with a statistically significant original study encounter problems when using traditional meta-analysis techniques. The original study’s effect size is most probably overestimated because it is statistically significant, and this bias is not taken into consideration in traditional meta-analysis. We have developed a hybrid method that does take the statistical significance of an original study into account and enables (a) accurate effect size estimation, (b) estimation of a confidence interval, and (c) testing of the null hypothesis of no effect. We analytically approximate the performance of the hybrid method and describe its statistical properties. By applying the hybrid method to data from the Reproducibility Project: Psychology (Open Science Collaboration, 2015), we demonstrate that the conclusions based on the hybrid method are often in line with those of the replication, suggesting that many published psychological studies have smaller effect sizes than those reported in the original study, and that some effects may even be absent. We offer hands-on guidelines for how to statistically combine an original study and replication, and have developed a Web-based application (https://rvanaert.shinyapps.io/hybrid) for applying the hybrid method.

AB - The unrealistically high rate of positive results within psychology has increased the attention to replication research. However, researchers who conduct a replication and want to statistically combine the results of their replication with a statistically significant original study encounter problems when using traditional meta-analysis techniques. The original study’s effect size is most probably overestimated because it is statistically significant, and this bias is not taken into consideration in traditional meta-analysis. We have developed a hybrid method that does take the statistical significance of an original study into account and enables (a) accurate effect size estimation, (b) estimation of a confidence interval, and (c) testing of the null hypothesis of no effect. We analytically approximate the performance of the hybrid method and describe its statistical properties. By applying the hybrid method to data from the Reproducibility Project: Psychology (Open Science Collaboration, 2015), we demonstrate that the conclusions based on the hybrid method are often in line with those of the replication, suggesting that many published psychological studies have smaller effect sizes than those reported in the original study, and that some effects may even be absent. We offer hands-on guidelines for how to statistically combine an original study and replication, and have developed a Web-based application (https://rvanaert.shinyapps.io/hybrid) for applying the hybrid method.

KW - ATTENTION

KW - BIAS

KW - CONFLICT

KW - DECISION

KW - EFFECT SIZE

KW - EXCESS

KW - IMPLICIT

KW - MEMORY

KW - METAANALYSIS

KW - Meta-analysis

KW - Replication

KW - Reproducibility

KW - SINGLE

KW - p-Uniform

U2 - 10.3758/s13428-017-0967-6

DO - 10.3758/s13428-017-0967-6

M3 - Article

VL - 50

SP - 1515

EP - 1539

JO - Behavior Research Methods

JF - Behavior Research Methods

SN - 1554-351X

IS - 4

ER -