Gradient and Gradient-Free Methods for Stochastic Convex Optimization with Inexact Oracle

Feb 22, 2015

Alexander Gasnikov, Pavel Dvurechensky, Dmitry Kamzolov
Abstract
In this paper we generalize the universal gradient method of Yu. Nesterov to the strongly convex case and to the intermediate gradient method of Devolder-Glineur-Nesterov. We also consider possible generalizations to the stochastic and online settings, and we show how these results extend to gradient-free methods and to the method of random direction search. The main ingredient of this paper is the assumption on the oracle: we consider the oracle to be inexact.
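To illustrate the kind of method discussed, here is a minimal sketch of a random direction search with an inexact zeroth-order oracle: the gradient is estimated from two noisy function values along a random unit direction. All names, step sizes, and the noise model below are illustrative assumptions, not the paper's exact scheme.

```python
import numpy as np

def random_direction_search(f, x0, n_iters=5000, step=0.05, mu=1e-3, delta=1e-3, seed=0):
    """Minimize a convex f via two-point random-direction gradient estimates.

    The oracle is inexact: each function evaluation returns the true value
    plus bounded noise of magnitude `delta` (simulated here for illustration).
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    # Inexact zeroth-order oracle: true value plus bounded noise.
    noisy = lambda z: f(z) + delta * rng.uniform(-1.0, 1.0)
    for k in range(1, n_iters + 1):
        e = rng.standard_normal(x.shape)
        e /= np.linalg.norm(e)  # random unit direction
        # Two-point directional derivative estimate along e.
        g = (noisy(x + mu * e) - noisy(x - mu * e)) / (2.0 * mu) * e
        x = x - (step / np.sqrt(k)) * g  # decreasing step size
    return x

# Example: minimize ||x - 1||^2 over R^5 starting from the origin.
x_star = random_direction_search(lambda x: np.sum((x - 1.0) ** 2), np.zeros(5))
```

The smoothing radius `mu` trades off bias of the estimate against amplification of the oracle noise `delta`, which is exactly the tension the inexact-oracle analysis quantifies.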
Type
Publication
In Conference on System Dynamics and Control Processes (SDCP2014)