Open Access Publications from the University of California

Minimum Divergence Moment Based Binary Response Models: Estimation and Inference


This paper introduces a new class of estimators based on minimization of the Cressie-Read (CR) power divergence measure for binary choice models, where neither a parameterized distribution nor a parameterization of the mean is specified explicitly in the statistical model. By incorporating sample information in the form of conditional moment conditions and estimating choice probabilities by optimizing a member of the set of divergence measures in the CR family, a new class of nonparametric estimators emerges that requires less a priori model structure than conventional parametric estimators such as probit or logit. Asymptotic properties are derived under general regularity conditions, and finite sampling properties are illustrated by Monte Carlo sampling experiments. Except for some special cases in which the general regularity conditions do not hold, the estimators have asymptotic normal distributions, similar to conventional parametric estimators of the binary choice model. The sampling experiments focus on the mean square errors in the choice probability predictions and the probability derivatives with respect to the response variable values. The simulation results suggest that estimators within the CR class are more robust than conventional methods of estimation across varying probability distributions underlying the Bernoulli process. The size and power of test statistics based on the asymptotics of the CR-based estimators exhibit behavior similar to those based on conventional parametric methods. Overall, the new class of nonparametric estimators for the binary response model is a promising and potentially more robust alternative to the parametric methods often used in empirical practice.
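To make the CR family concrete, the sketch below computes the Cressie-Read power divergence between two discrete probability distributions for a given power parameter. The function name and implementation are illustrative, not code from the paper; the limiting cases (Kullback-Leibler divergence as the power tends to 0, reverse KL as it tends to -1, and half of Pearson's chi-square at power 1) are standard properties of the family.

```python
import numpy as np

def cressie_read(p, q, gamma):
    """Cressie-Read power divergence I(p, q; gamma) between discrete
    distributions p and q (both strictly positive, summing to 1).

    General form: (1 / (gamma * (gamma + 1))) * sum(p * ((p/q)**gamma - 1)).
    Limiting members: gamma -> 0 gives KL(p || q); gamma -> -1 gives
    KL(q || p); gamma = 1 gives one half of Pearson's chi-square statistic.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    if np.isclose(gamma, 0.0):
        return float(np.sum(p * np.log(p / q)))          # KL(p || q)
    if np.isclose(gamma, -1.0):
        return float(np.sum(q * np.log(q / p)))          # KL(q || p)
    return float(np.sum(p * ((p / q) ** gamma - 1.0)) / (gamma * (gamma + 1.0)))
```

In the estimation framework described above, a member of this family (indexed by the power parameter) is minimized over candidate choice probabilities subject to the conditional moment conditions, rather than being evaluated at fixed distributions as in this illustration.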
