Accession Number:

ADA456685

Title:

Optimal Rates for Regularization Operators in Learning Theory

Descriptive Note:

Technical rept.

Corporate Author:

MASSACHUSETTS INST OF TECH CAMBRIDGE COMPUTER SCIENCE AND ARTIFICIAL INTELLIGENCE LAB

Personal Author(s):

Report Date:

2006-09-10

Pagination or Media Count:

19

Abstract:

We develop some new error bounds for learning algorithms induced by regularization methods in the regression setting. The hardness of the problem is characterized in terms of the parameters r and s, the first related to the complexity of the target function, the second connected to the effective dimension of the marginal probability measure over the input space. We show, extending previous results, that by a suitable choice of the regularization parameter as a function of the number of available examples, it is possible to attain the optimal minimax rates of convergence for the expected squared loss of the estimators, over the family of priors fulfilling the constraint r + s ≥ 1/2. The setting considers both labelled and unlabelled examples, the latter being crucial for the optimality results on the priors in the range r < 1/2.
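To make the parameter choice concrete, the following is a minimal numerical sketch of one regularization method in the regression setting (kernel ridge regression, a standard instance, not necessarily the exact operator family studied in the report). The schedule lambda_n = n^(-1/(2r+s)) and the associated rate O(n^(-2r/(2r+s))) follow the standard form in this literature; the particular values of r and s, the Gaussian kernel, and the test function are illustrative assumptions, not taken from the report.

    import numpy as np

    def gaussian_kernel(X1, X2, width=1.0):
        # Pairwise Gaussian (RBF) kernel matrix between the rows of X1 and X2.
        sq_dists = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(axis=2)
        return np.exp(-sq_dists / (2.0 * width ** 2))

    def krr_fit(X, y, lam):
        # Regularized least squares in an RKHS: solve (K + n*lam*I) c = y.
        n = len(X)
        K = gaussian_kernel(X, X)
        return np.linalg.solve(K + n * lam * np.eye(n), y)

    rng = np.random.default_rng(0)
    n = 200
    X = rng.uniform(-1.0, 1.0, size=(n, 1))
    y = np.sin(np.pi * X[:, 0]) + 0.1 * rng.standard_normal(n)

    # Illustrative prior parameters (assumed values, not from the report).
    r, s = 0.5, 0.5

    # Regularization parameter chosen as a function of the number of examples;
    # under the standard source/effective-dimension conditions this schedule
    # yields the minimax rate O(n^(-2r/(2r+s))) for the expected squared loss.
    lam = n ** (-1.0 / (2.0 * r + s))
    coef = krr_fit(X, y, lam)

    # Evaluate the estimator on a grid of new inputs.
    X_test = np.linspace(-1.0, 1.0, 50)[:, None]
    predictions = gaussian_kernel(X_test, X) @ coef
    print("n =", n, "lambda_n =", lam)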

Subject Categories:

  • Statistics and Probability
  • Computer Programming and Software

Distribution Statement:

APPROVED FOR PUBLIC RELEASE