Jackknife and Bootstrap Inference in Regression and a Class of Representations for the LSE.
Technical summary report
WISCONSIN UNIV-MADISON MATHEMATICS RESEARCH CENTER
A class of representations for the least squares estimator is presented and their applications are sketched. Partly motivated by one such representation, the author proposes a class of weighted jackknife estimators of the variance of the least squares estimator, obtained by deleting any fixed number of observations at a time. These estimators are unbiased for homoscedastic errors, and a special case, the delete-one jackknife variance estimator, is almost unbiased for heteroscedastic errors. The method is extended in various ways, including the use of the jackknife histogram, for variance and interval estimation with nonlinear parameters. Three bootstrap methods are considered; it is shown that none of them has the robustness property enjoyed by the weighted delete-one jackknife. Subset sampling with variable subset size is also considered. Several bias-reducing estimators are proposed, motivated by the observation that bias reduction is mathematically equivalent to unbiased estimation of variance. Some simulation results on estimating the ratio of two normal parameters are reported.
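The report itself gives no formulas in this abstract, but the delete-one jackknife for the least squares estimator admits a well-known closed form that avoids refitting: deleting observation i shifts the estimate by -(X'X)^{-1} x_i e_i / (1 - h_i), where e_i is the residual and h_i the leverage. The sketch below illustrates a weighted delete-one jackknife variance estimate built from that identity; the specific choice of weights (1 - h_i) is an assumption for illustration and may not match the weighting the report proposes.

```python
import numpy as np

def weighted_delete_one_jackknife(X, y):
    """Sketch of a weighted delete-one jackknife variance estimate
    for the least squares estimator.

    Assumption: each delete-one perturbation d_i = beta_(i) - beta
    is weighted by (1 - h_i), h_i being the i-th leverage. The
    report's actual weights may differ.
    """
    n, p = X.shape
    XtX_inv = np.linalg.inv(X.T @ X)
    beta = XtX_inv @ X.T @ y                       # full-sample LSE
    e = y - X @ beta                               # residuals
    h = np.einsum('ij,jk,ik->i', X, XtX_inv, X)    # leverages h_i
    # Closed form: beta_(i) - beta = -XtX_inv @ x_i * e_i / (1 - h_i)
    D = -(XtX_inv @ (X * (e / (1.0 - h))[:, None]).T).T  # n x p deltas
    # Weighted jackknife variance: sum_i (1 - h_i) d_i d_i'
    V = (D * (1.0 - h)[:, None]).T @ D
    return beta, V
```

With these weights the estimator reduces to a heteroscedasticity-consistent sandwich form (the e_i^2 / (1 - h_i) rescaling), which is consistent with the abstract's claim that the delete-one variant is almost unbiased under heteroscedastic errors.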
- Statistics and Probability