"False"
Skip to content
printicon
Main menu hidden.

David Källberg, Umeå University

Time: Tuesday 8 October 2019, 13:15 - 14:15
Place: MA346, MIT building

Speaker: David Källberg, Department of Statistics, Umeå School of Business, Economics and Statistics

Title: Entropy balancing for estimation of average causal effects

Abstract:

In observational studies with binary treatments, weighting methods are used to adjust for covariate imbalance between treated and control units. Entropy balancing estimators have been proposed as an alternative to inverse probability weighting with an estimated propensity score. In this approach, the researcher specifies a set of balance constraints for the observed covariates and employs a scheme that searches for weights minimizing the divergence relative to a set of uniform base weights. We describe the large sample properties of entropy balancing estimators based on the Kullback-Leibler divergence and the quadratic-order Rényi divergence, respectively, and propose plug-in estimators of their asymptotic variances. Even though the objective of entropy balancing is to reduce the dependence on a propensity score model, the two estimators entail implicit parametric functional forms for the propensity score. These parametric models imply that the estimators are generally not consistent for the true causal effect unless the models are correct. However, similarly to the class of augmented inverse probability weighting estimators, we show consistency and asymptotic normality of the estimators provided that the potential outcome means conditional on the covariates are linear combinations of the constraints selected in the balancing scheme. Moreover, the propensity score and outcome model assumptions can be made separately for the treated and the controls, which yields additional robustness results. We investigate the finite sample properties of the estimators in a simulation study replicating a design from the literature.
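To illustrate the balancing scheme described in the abstract, the sketch below solves the Kullback-Leibler entropy balancing problem with uniform base weights for the control group, matching its weighted covariate means to the treated sample means via the convex dual. This is a minimal illustration, not the speaker's implementation: the function name entropy_balance_weights, the choice of scipy's BFGS solver, and the toy data are assumptions made for the example.

import numpy as np
from scipy.optimize import minimize

def entropy_balance_weights(X_control, target_means):
    """Hypothetical sketch: KL entropy balancing weights for controls.

    X_control    : (n0, p) matrix of balance constraints for control units
    target_means : (p,) treated-sample means the weighted controls must match
    Returns nonnegative weights summing to 1 that satisfy the constraints.
    """
    C = X_control - target_means          # centered constraints c(X_i) - m

    def dual(lam):
        # Convex dual objective: log sum_i exp(lambda' (c(X_i) - m))
        z = C @ lam
        zmax = z.max()
        return zmax + np.log(np.exp(z - zmax).sum())

    def grad(lam):
        # Gradient is the weighted imbalance; it is zero at the optimum,
        # so the solution achieves exact balance on the constraints.
        z = C @ lam
        w = np.exp(z - z.max())
        w /= w.sum()
        return C.T @ w

    res = minimize(dual, np.zeros(C.shape[1]), jac=grad, method="BFGS")
    z = C @ res.x
    w = np.exp(z - z.max())               # exponential-tilting form of weights
    return w / w.sum()

# Toy usage with simulated covariates (assumed data, for illustration only)
rng = np.random.default_rng(0)
X_t = rng.normal(0.5, 1.0, size=(200, 2))   # "treated" covariates
X_c = rng.normal(0.0, 1.0, size=(400, 2))   # "control" covariates
w = entropy_balance_weights(X_c, X_t.mean(axis=0))
print(np.allclose(w @ X_c, X_t.mean(axis=0), atol=1e-6))  # True: means balanced

The estimated average causal effect on the treated would then contrast the treated outcome mean with the w-weighted control outcome mean; the implicit logistic-type functional form for the weights is what the abstract refers to as the estimator's implicit parametric propensity score model.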


Event type: Seminar