A Stable and Efficient Algorithm for Nonlinear Orthogonal Distance Regression.
University of Colorado at Boulder, Department of Computer Science
One of the most widely used methodologies in scientific and engineering research is the fitting of equations to data by least squares. In cases where significant observation errors exist in all of the data (independent) variables, however, the ordinary least squares approach, in which all errors are attributed to the observation (dependent) variable, is often inappropriate. An alternative approach, suggested by several researchers, involves minimizing the sum of squared orthogonal distances between each data point and the curve described by the model equation. We refer to this as orthogonal distance regression (ODR). This paper describes a method for solving the orthogonal distance regression problem that is a direct analog of the trust-region Levenberg-Marquardt algorithm. The number of unknowns involved is the number of model parameters plus the number of data points, often a very large number. By exploiting sparsity, however, our algorithm requires computational effort per step of the same order as the Levenberg-Marquardt method for ordinary least squares. We prove our algorithm to be globally and locally convergent, and perform computational tests that illustrate some differences between the two approaches.
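Algorithms in this family are available today through SciPy's `scipy.odr` module, a wrapper around the ODRPACK library. The following minimal sketch illustrates the contrast the abstract draws: fitting a model when both x and y carry observation error. The exponential model, noise levels, and starting values are illustrative assumptions, not taken from the paper.

```python
import numpy as np
from scipy import odr

def f(beta, x):
    # Model curve y = beta0 * exp(beta1 * x); beta holds the model parameters.
    return beta[0] * np.exp(beta[1] * x)

# Synthetic data with errors in BOTH variables (the ODR setting):
# true parameters beta = (2.0, 1.5), chosen arbitrarily for this sketch.
rng = np.random.default_rng(0)
x_true = np.linspace(0.0, 1.0, 20)
y_true = 2.0 * np.exp(1.5 * x_true)
x = x_true + rng.normal(scale=0.02, size=x_true.size)  # error in x
y = y_true + rng.normal(scale=0.05, size=y_true.size)  # error in y

# RealData accepts standard deviations for both variables; ODR minimizes
# the (weighted) sum of squared orthogonal distances to the model curve.
model = odr.Model(f)
data = odr.RealData(x, y, sx=0.02, sy=0.05)
result = odr.ODR(data, model, beta0=[1.0, 1.0]).run()
print(result.beta)  # estimated (beta0, beta1)
```

Ordinary least squares would attribute all the scatter to y; here the solver also adjusts the unknown true abscissae, which is why the problem has one extra unknown per data point, as the abstract notes.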
- Statistics and Probability