Honey, I shrunk the irrelevant effects! Simple and flexible approximate Bayesian regularization

Diana Karimova, Sara van Erp, Roger Th. A. J. Leenders, Joris Mulder*

*Corresponding author for this work

Research output: Contribution to journal › Article › Scientific › Peer-reviewed

Abstract

In the social and behavioral sciences and related fields, statistical models are becoming increasingly complex with more parameters to explain intricate dependency structures among larger sets of variables. Regularization techniques, like penalized regression, help identify key parameters by shrinking negligible effects to zero, resulting in parsimonious solutions with strong predictive performance. This paper introduces a simple and flexible approximate Bayesian regularization (ABR) procedure, combining a Gaussian approximation of the likelihood with a Bayesian shrinkage prior to obtain a regularized posterior. Parsimonious (interpretable) solutions are obtained by taking the posterior modes. Parameter uncertainty is quantified using the full posterior. Implemented in the R package shrinkem, the method is evaluated in synthetic and empirical applications. Its flexibility is demonstrated across various models, including linear regression, relational event models, mediation analysis, factor analysis, and Gaussian graphical models.
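The ABR recipe summarized in the abstract can be illustrated with a minimal numerical sketch: approximate the likelihood by a Gaussian centered at the maximum likelihood estimate, combine it with a shrinkage prior, and take the posterior mode as the parsimonious estimate. This is a conceptual illustration in Python (NumPy/SciPy), not the authors' shrinkem package; the Laplace prior and its scale `lam` are assumptions chosen for the example.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n, p = 200, 5
X = rng.normal(size=(n, p))
beta_true = np.array([2.0, 0.0, 0.0, 1.0, 0.0])  # two relevant, three negligible effects
y = X @ beta_true + rng.normal(size=n)

# Step 1: Gaussian approximation of the likelihood at the MLE.
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta_hat
sigma2 = resid @ resid / (n - p)
Sigma_inv = X.T @ X / sigma2  # inverse covariance of the approximate Gaussian

# Step 2: combine with a shrinkage prior (a Laplace prior here; shrinkem
# supports other shrinkage priors -- this choice is for illustration only).
lam = 10.0  # hypothetical prior scale

def neg_log_post(beta):
    """Negative log of the regularized (approximate) posterior."""
    d = beta - beta_hat
    return 0.5 * d @ Sigma_inv @ d + lam * np.abs(beta).sum()

# Step 3: the posterior mode gives the regularized point estimate, with
# negligible effects shrunk toward zero while large effects survive.
mode = minimize(neg_log_post, beta_hat, method="Powell").x
print("MLE:           ", np.round(beta_hat, 3))
print("Posterior mode:", np.round(mode, 3))
```

In the full procedure, parameter uncertainty would additionally be quantified from the entire regularized posterior (e.g. by sampling), not just its mode.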
Original language: English
Article number: 102925
Number of pages: 16
Journal: Journal of Mathematical Psychology
Volume: 126
DOIs
Publication status: Published - Aug 2025

Keywords

  • Bayesian analysis
  • Penalized regression
  • Psychological modeling
  • Statistical regularization
  • Variable selection

