A Convergence Analysis of the Perturbed Compositional Gradient Flow: Averaging Principle and Normal Deviations


We consider in this work a system of two stochastic differential equations called the perturbed compositional gradient flow. By introducing a separation of fast and slow scales between the two equations, we show that the limit of the slow motion is given by an averaged ordinary differential equation. We then demonstrate that the deviation of the slow motion from the averaged equation, after proper rescaling, converges to a stochastic process with Gaussian inputs. This indicates that the slow motion can be approximated in the weak sense by a standard perturbed gradient flow or the continuous-time stochastic gradient descent algorithm that solves the optimization problem for a composition of two functions. As an application, the perturbed compositional gradient flow corresponds to the diffusion limit of the Stochastic Composite Gradient Descent (SCGD) algorithm for minimizing a composition of two expected-value functions in the optimization literature. For the strongly convex case, this analysis implies that the SCGD algorithm has the same asymptotic convergence time as the classical stochastic gradient descent algorithm. Thus it validates, at the level of continuous approximation, the effectiveness of the SCGD algorithm in the strongly convex case.
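The two-timescale structure behind the SCGD algorithm discussed above can be illustrated with a short sketch. The following toy example is an assumption-laden illustration, not the paper's method: it uses the standard SCGD update on a hypothetical composition F(x) = f(E[g_w(x)]) with inner map g_w(x) = x + w (noise w of mean zero) and outer map f(y) = ||y||^2 / 2, so the minimizer is x* = 0. A fast step size beta_k tracks the inner expectation while a slow step size alpha_k drives the quasi-gradient descent, mirroring the fast-slow separation in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy composition F(x) = f(E[g_w(x)]) with
#   inner map  g_w(x) = x + w,  w ~ N(0, 0.09 I)  (so E[g_w(x)] = x)
#   outer map  f(y)   = 0.5 * ||y||^2
# The unique minimizer is x* = 0. These maps are illustrative
# choices for the sketch, not taken from the paper.

def g(x, w):                 # noisy sample of the inner function
    return x + w

def grad_g(x, w):            # Jacobian of g_w (identity for this toy map)
    return np.eye(x.size)

def grad_f(y):               # gradient of the outer function
    return y

x = np.array([5.0, -3.0])                     # slow variable
y = g(x, rng.normal(scale=0.3, size=2))       # fast tracker of E[g_w(x)]

for k in range(1, 5001):
    alpha = 1.0 / k          # slow step size (main iterate)
    beta = 1.0 / k ** 0.75   # fast step size, beta_k >> alpha_k
    w = rng.normal(scale=0.3, size=2)
    # fast scale: running estimate of the inner expectation E[g_w(x)]
    y = (1 - beta) * y + beta * g(x, w)
    # slow scale: quasi-gradient step on the composition
    x = x - alpha * grad_g(x, w).T @ grad_f(y)

print(x)  # near the minimizer [0, 0]
```

The separation beta_k >> alpha_k lets the fast variable y equilibrate around the inner expectation before the slow variable x moves appreciably, which is exactly the averaging regime the abstract analyzes in continuous time.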


Mathematics and Statistics

Keywords and Phrases

Perturbed compositional gradient flow; Averaging principle; Fast-slow dynamical systems; Normal deviation; Perturbed gradient flow; Stochastic composite gradient descent; Stochastic gradient descent

International Standard Serial Number (ISSN)

1078-0947; 1553-5231

Document Type

Article - Journal


© 2018 American Institute of Mathematical Sciences (AIMS), All rights reserved.

Publication Date

01 Oct 2018