Accession Number:

AD1023882

Title:

Convergence Rates of Finite Difference Stochastic Approximation Algorithms

Descriptive Note:

Technical Report

Corporate Author:

North Carolina State University Raleigh United States

Personal Author(s):

Report Date:

2016-06-01

Pagination or Media Count:

38

Abstract:

Recently there has been renewed interest in derivative-free approaches to stochastic optimization. In this paper, we examine the rates of convergence of the Kiefer-Wolfowitz algorithm and the mirror descent algorithm under various updating schemes that use finite differences as gradient approximations. It is shown that the convergence of these algorithms can be accelerated by controlling the implementation of the finite differences. In particular, the rate can be increased to n^{-2/5} in general, and to n^{-1/2} in Monte Carlo optimization for a broad class of problems, where n is the iteration number.
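To make the setting concrete, the following is a minimal sketch of the classical one-dimensional Kiefer-Wolfowitz scheme with central finite differences, under assumed gain sequences a_n = a/n and c_n = c/n^{1/4} (a standard textbook choice, not necessarily the schedules analyzed in the report); the function names and the toy objective are hypothetical.

```python
import random

def kw_minimize(f_noisy, x0, n_iter=2000, a=1.0, c=1.0):
    """Kiefer-Wolfowitz stochastic approximation with central finite differences.

    f_noisy(x) returns a noisy evaluation of the objective at x.
    The gains a_n = a/n and perturbation widths c_n = c/n**0.25 are one
    classical choice; the report studies how controlling such finite-difference
    implementations affects the convergence rate.
    """
    x = x0
    for n in range(1, n_iter + 1):
        a_n = a / n
        c_n = c / n ** 0.25
        # Central finite-difference estimate of the gradient at x,
        # built from two noisy function evaluations.
        g = (f_noisy(x + c_n) - f_noisy(x - c_n)) / (2.0 * c_n)
        # Stochastic-approximation update step.
        x = x - a_n * g
    return x

# Hypothetical test problem: minimize E[(x - 3)^2 + noise]; the minimizer is x* = 3.
random.seed(0)
noisy = lambda x: (x - 3.0) ** 2 + random.gauss(0.0, 0.1)
x_hat = kw_minimize(noisy, x0=0.0)
```

Because each gradient estimate is formed purely from noisy function values, the scheme is derivative-free; the perturbation width c_n trades bias (large c_n) against noise amplification (small c_n), which is exactly the tuning lever the abstract refers to.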

Subject Categories:

Distribution Statement:

APPROVED FOR PUBLIC RELEASE