A Combined Biased-Robust Estimator for Dealing with Influence and Collinearity in Regression
AIR FORCE INST OF TECH WRIGHT-PATTERSON AFB OH
Regression analysis is a statistical tool that has earned widespread use in nearly every field that seeks to fit a model to a set of data. Although there are several methods of estimating the model parameters, the least squares method is used most often because of its general acceptance, elegant statistical properties, and ease of computation. Unfortunately, the mathematical elegance that makes least squares so popular depends on a number of fairly strong and often unrealistic assumptions. The assumption that makes least squares attractive for hypothesis testing and for constructing confidence intervals on the parameter estimates is that the distribution of the errors is normal, or Gaussian. This assumption can be violated when one or more sufficiently outlying observations are present in the data, resulting in less than optimal estimates of the parameters. The second problem that can ruin the accuracy of least squares estimates is correlated regressors. Highly correlated regressors inflate the variances of the coefficient estimates, sometimes producing coefficients of the wrong magnitude or even the wrong sign. The objective of this research is to develop a biased-robust regression estimator and to determine how the method performs in the presence of nonnormal errors, outliers, and multicollinear regressor variables. To accomplish this major objective, a number of investigative questions must be answered.
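To illustrate the two failure modes the abstract describes, and one way a biased-robust estimator can address both, here is a minimal NumPy sketch. It combines a ridge penalty (a deliberately biased shrinkage that stabilizes collinear regressors) with Huber-style iteratively reweighted least squares (which downweights outlying observations). This is a hypothetical illustration of the general technique, not the specific estimator developed in the thesis; the function names, penalty value, and tuning constant are assumptions chosen for the demo.

```python
import numpy as np

def ols(X, y):
    # Ordinary least squares via the normal equations.
    return np.linalg.solve(X.T @ X, X.T @ y)

def ridge_huber(X, y, lam=0.5, c=1.345, iters=50):
    # Illustrative biased-robust fit: ridge shrinkage for collinearity
    # plus Huber IRLS weights for outliers. Sketch only, not the
    # thesis's estimator.
    n, p = X.shape
    I = np.eye(p)
    beta = np.linalg.solve(X.T @ X + lam * I, X.T @ y)  # ridge start
    for _ in range(iters):
        r = y - X @ beta
        # Robust scale estimate via the median absolute deviation (MAD).
        s = np.median(np.abs(r - np.median(r))) / 0.6745 + 1e-12
        u = r / s
        # Huber weights: full weight for small residuals, downweighted tails.
        w = np.where(np.abs(u) <= c, 1.0, c / np.abs(u))
        W = np.diag(w)
        beta = np.linalg.solve(X.T @ W @ X + lam * I, X.T @ W @ y)
    return beta

# Demo: nearly collinear regressors plus one gross outlier.
rng = np.random.default_rng(0)
n = 100
x1 = rng.normal(size=n)
x2 = x1 + 0.01 * rng.normal(size=n)      # highly correlated with x1
X = np.column_stack([x1, x2])
beta_true = np.array([1.0, 1.0])
y = X @ beta_true + 0.1 * rng.normal(size=n)
y[0] += 20.0                              # one outlying observation

b_ols = ols(X, y)          # can show wrong magnitudes or signs here
b_br = ridge_huber(X, y)   # shrinkage + downweighting keeps it near truth
```

Under collinearity this severe, the ordinary least squares coefficients sit in a nearly flat direction of the objective, so the single outlier can swing them far from (1, 1) with opposite signs, exactly the pathology the abstract describes; the combined estimator shrinks that unstable direction and downweights the outlier.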
- Statistics and Probability