Robust Sandwich Covariate Estimate

Overview

The Robust Sandwich Covariate Estimate, also known as the Huber Sandwich Estimator and the Robust Sandwich Estimator, can be used to estimate the variance of an ordinary least squares (OLS) regression when the underlying model is incorrect, for example when the homoscedasticity assumption is violated. The Sandwich Estimator is generally only of use when the OLS model has serious error; in that case it can still provide a reasonable estimate of the variance, but the parameters being estimated by OLS may become difficult to interpret [1].

When to use it?

Use it when trying to estimate standard errors from data where heteroscedasticity is present, that is, the circumstance in which the variability of a variable is unequal across the range of values of a second variable that predicts it. When estimating from heteroscedastic data, the ordinary least squares estimator of the coefficients remains consistent, but the conventional estimated standard errors are biased because the variance of the dependent variable is not constant across values of the independent variable.
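
As an illustration (a minimal sketch, not part of the original article), the following Python snippet simulates heteroscedastic data and compares the conventional OLS standard errors with the robust sandwich ("HC0") standard errors using statsmodels; the variable names and simulation settings are arbitrary choices for the example.

  # Minimal sketch: conventional vs. robust (sandwich) standard errors on
  # simulated heteroscedastic data. Names and settings are illustrative only.
  import numpy as np
  import statsmodels.api as sm

  rng = np.random.default_rng(0)
  n = 500
  x = rng.uniform(1, 10, size=n)
  # The error spread grows with x, so the data are heteroscedastic.
  y = 1.0 + 2.0 * x + rng.normal(scale=0.5 * x, size=n)

  X = sm.add_constant(x)                      # design matrix with intercept
  classic = sm.OLS(y, X).fit()                # assumes constant error variance
  robust = sm.OLS(y, X).fit(cov_type="HC0")   # Huber/White sandwich estimator

  print("conventional SEs:", classic.bse)
  print("robust (HC0) SEs:", robust.bse)
  # The coefficient estimates are identical in both fits; only the standard
  # errors (and hence the t-statistics) change.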

How does it work?

For OLS, you can imagine that you are using the estimated variance of the residuals (under the assumption of independence and homoscedasticity) as an estimate of the conditional variance of the dependent variable. In the sandwich-based estimator, you instead use each observed squared residual as a plug-in estimate of that variance, which is then allowed to differ between observations. By plugging in the squared residuals, we obtain consistent estimates of the variance of the coefficient estimates even when the errors are heteroscedastic. The observed squared residuals pick up the extra variability due to heteroscedasticity that would otherwise go unaccounted for under the assumption of constant variance.
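
To make the plug-in idea concrete, below is a short from-scratch sketch (an assumption-laden example, not taken from the original article) of the heteroscedasticity-consistent "HC0" sandwich covariance, (X'X)^(-1) X' diag(e_i^2) X (X'X)^(-1), written in Python with NumPy; the function name sandwich_se and its inputs are hypothetical.

  # Sketch of the HC0 sandwich covariance: (X'X)^{-1} X' diag(e_i^2) X (X'X)^{-1}.
  # The function name and inputs are illustrative, not from the original article.
  import numpy as np

  def sandwich_se(X, y):
      """Return OLS coefficients and their HC0 (sandwich) standard errors."""
      XtX_inv = np.linalg.inv(X.T @ X)          # the "bread" of the sandwich
      beta = XtX_inv @ X.T @ y                  # ordinary least squares fit
      resid = y - X @ beta                      # observed residuals
      meat = X.T @ (X * resid[:, None] ** 2)    # X' diag(e_i^2) X, the "meat"
      cov = XtX_inv @ meat @ XtX_inv            # sandwich covariance matrix
      return beta, np.sqrt(np.diag(cov))

Replacing the individual squared residuals with a single pooled residual variance would collapse this expression back to the usual OLS covariance, which is exactly the constant-variance assumption the sandwich form avoids.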

References

  1. Huber, P. J. (1967). “The Behavior of Maximum Likelihood Estimates under Nonstandard Conditions,” Proceedings of the Fifth Berkeley Symposium on Mathematical Statistics and Probability, vol. I, pp. 221–33.