Trimmed Likelihood-based Estimation in Binary Regression Models

Research output: Working paper › Discussion paper › Other research output


Abstract

Binary-choice regression models such as probit and logit are typically estimated by maximum likelihood. To improve its robustness, various M-estimation-based procedures have been proposed; these, however, require bias corrections to achieve consistency, and their resistance to outliers is relatively low. By contrast, traditional high-breakdown-point methods such as maximum trimmed likelihood are not applicable, since by trimming observations they induce separation of the data and thus non-identification of the estimates. We propose a new robust estimator of binary-choice models based on a maximum symmetrically trimmed likelihood estimator. It is proved to be identified and consistent, and, in addition, it does not create separation in the space of explanatory variables, unlike the existing maximum trimmed likelihood estimator. We also discuss asymptotic and robust properties of the proposed method and compare all methods by means of Monte Carlo simulations.
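
To make the trimming idea concrete, the following is a minimal sketch of a classical maximum trimmed likelihood fit for a logit model: the coefficients are chosen to maximize the sum of the h largest per-observation log-likelihood contributions. This illustrates only the generic trimming principle criticized in the abstract, not the symmetrically trimmed estimator proposed in the paper; the function names, the multi-start optimisation, and the use of SciPy are illustrative assumptions.

    # Sketch of a maximum trimmed likelihood (MTL) fit for a logit model.
    # Generic trimming only; NOT the symmetrically trimmed estimator of the paper.
    import numpy as np
    from scipy.optimize import minimize

    def logit_loglik_per_obs(beta, X, y):
        """Per-observation log-likelihood contributions of a logit model."""
        xb = X @ beta
        # log P(y_i | x_i) = y_i * xb_i - log(1 + exp(xb_i)), computed stably
        return y * xb - np.logaddexp(0.0, xb)

    def neg_trimmed_loglik(beta, X, y, h):
        """Negative sum of the h largest log-likelihood contributions."""
        ll = logit_loglik_per_obs(beta, X, y)
        kept = np.sort(ll)[-h:]          # keep the h best-fitting observations
        return -np.sum(kept)

    def fit_mtl_logit(X, y, trim=0.1, n_starts=20, seed=0):
        """Crude multi-start optimisation; the trimmed objective is non-smooth."""
        n, k = X.shape
        h = int(np.ceil((1.0 - trim) * n))   # number of retained observations
        rng = np.random.default_rng(seed)
        best = None
        for _ in range(n_starts):
            beta0 = rng.normal(scale=0.5, size=k)
            res = minimize(neg_trimmed_loglik, beta0, args=(X, y, h),
                           method="Nelder-Mead")
            if best is None or res.fun < best.fun:
                best = res
        return best.x

With trim=0 every observation is retained and the fit reduces to ordinary maximum likelihood. The abstract's point is that naive trimming of this kind can separate the data in the space of explanatory variables and leave the coefficients unidentified, which is what motivates the symmetric trimming studied in the paper.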
Original language: English
Place of Publication: Tilburg
Publisher: Econometrics
Number of pages: 11
Volume: 2005-108
Publication status: Published - 2005

Publication series

Name: CentER Discussion Paper
Volume: 2005-108

Keywords

  • regression analysis
  • maximum likelihood
  • binary-choice regression
  • robust estimation
  • trimming
