# Accession Number:

## AD0248660

# Title:

## CENTRAL LIMIT THEOREM AND CONSISTENCY IN LINEAR REGRESSION

# Descriptive Note:

# Corporate Author:

## NORTH CAROLINA UNIV AT CHAPEL HILL DEPT OF STATISTICS

# Personal Author(s):

# Report Date:

## 1960-12-01

# Pagination or Media Count:

## 1.0

# Abstract:

Several asymptotic properties of the least squares estimators for the parameters in the linear regression model with non-identical, independent errors are derived as the sample size increases indefinitely. The notion of convergence of a sequence of random variables b on a set F is introduced and applied: the b depend functionally on an arbitrarily chosen sequence of random variables, all of which are elements of the set F, and they converge over each such sequence. If F is any set of random variables with zero means and bounded variances which contains at least one normal variable, then there exists a simple necessary and sufficient condition on the regression matrix X such that the estimators are consistent on F. For asymptotic normality of essentially these estimators on any subset G of the set of all zero-mean random variables whose variances exist, necessary and sufficient conditions on X and the set G, to be fulfilled simultaneously, are given. This is the central limit theorem for the linear regression model. For the case of unknown error variances, a statistic is constructed, and sufficient conditions on X and G are given to assure its asymptotic normality on G. One important aspect of the theory developed is its non-parametric character and the breadth of the range of admissible error distributions. Some examples and illustrations are given. (Author)
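The abstract does not reproduce the model or the conditions themselves; the following is a hedged sketch of the standard formalization of this setup (the notation $y_n$, $X_n$, $\beta$, $\varepsilon$, and the eigenvalue condition are the usual forms for such results, assumed here rather than quoted from the report):

```latex
% Linear model with independent, non-identically distributed errors:
% n observations, p fixed regression parameters.
\[
  y_n = X_n \beta + \varepsilon_n, \qquad
  \mathbb{E}[\varepsilon_i] = 0, \quad
  \operatorname{Var}(\varepsilon_i) = \sigma_i^2 \le c < \infty ,
\]
% Least squares estimator for beta:
\[
  \hat{\beta}_n = (X_n^{\top} X_n)^{-1} X_n^{\top} y_n .
\]
% A condition on X of the kind the abstract refers to is typically
% stated via the smallest eigenvalue of the design matrix:
\[
  \lambda_{\min}\!\left(X_n^{\top} X_n\right) \longrightarrow \infty
  \quad (n \to \infty),
\]
% which forces the covariance of the estimator,
% (X_n' X_n)^{-1} X_n' D X_n (X_n' X_n)^{-1} with D = diag(sigma_i^2),
% to vanish, giving consistency of \hat{\beta}_n on F.
```

The exact necessary and sufficient conditions on $X$ and on the set $G$ for the central limit theorem are the subject of the report itself and are not restated here.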