A Computational Examination of Orthogonal Distance Regression.
University of Colorado at Boulder, Department of Computer Science
Classical, or ordinary, least squares (OLS) is one of the most commonly used criteria for fitting data to models and for estimating parameters. This is true even when a key assumption for its use, namely that the independent variables are known exactly, is violated. Orthogonal distance regression (ODR) extends least squares data fitting to problems in which the independent variables are not known exactly. Theoretical analysis, however, shows that OLS is preferable to ODR for straight-line functions under certain conditions, even when there are measurement errors in the independent variable. This has led some to conjecture that under similar conditions OLS will also be preferable to ODR for nonlinear functions, even though there are errors in the independent variable. This paper presents the results of an empirical study designed to examine whether ODR provides better results than OLS when there are errors in the independent variable. A variety of functions, both linear and nonlinear, are examined under a variety of experimental conditions. The results indicate that, for the data and performance criteria considered, ODR never performs appreciably worse than OLS and sometimes performs considerably better. This leads to the conclusion that ODR is appropriate for a wide variety of practical problems.
- Statistics and Probability
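The kind of comparison the abstract describes can be sketched with SciPy's `scipy.odr` module, which implements orthogonal distance regression. This is a hedged illustration only: the straight-line model, the error scales, and the synthetic data below are assumptions for demonstration and do not reproduce the study's actual test functions or experimental conditions.

```python
import numpy as np
from scipy import odr

rng = np.random.default_rng(0)

# Synthetic straight-line data y = b0 + b1*x with measurement error
# in BOTH variables (illustrative values, not the study's settings).
beta_true = (1.0, 2.0)
n = 200
x_true = np.linspace(0.0, 10.0, n)
x_obs = x_true + rng.normal(scale=0.5, size=n)  # error in the independent variable
y_obs = beta_true[0] + beta_true[1] * x_true + rng.normal(scale=0.5, size=n)

# OLS fit: treats the observed x values as exact.
ols_slope, ols_intercept = np.polyfit(x_obs, y_obs, 1)

# ODR fit: minimizes orthogonal distances, accounting for errors in x and y.
def linear(beta, x):
    return beta[0] + beta[1] * x

model = odr.Model(linear)
data = odr.RealData(x_obs, y_obs, sx=0.5, sy=0.5)
fit = odr.ODR(data, model, beta0=[0.0, 1.0]).run()
odr_intercept, odr_slope = fit.beta

print("OLS estimate: intercept=%.3f slope=%.3f" % (ols_intercept, ols_slope))
print("ODR estimate: intercept=%.3f slope=%.3f" % (odr_intercept, odr_slope))
```

With errors in the independent variable, OLS slope estimates are subject to attenuation bias (pulled toward zero), which ODR mitigates by attributing error to both coordinates; whether that matters in practice is exactly the question the study's empirical comparison addresses.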