On the Global Convergence of Continuous-Time Stochastic Heavy-Ball Method for Nonconvex Optimization
Abstract
We study the convergence behavior of a stochastic heavy-ball method with a small stepsize. Under a change of time scale, we approximate the discrete scheme by a stochastic differential equation that models small random perturbations of a coupled system of nonlinear oscillators. We rigorously show that the perturbed system converges to a local minimum in logarithmic time. This indicates that the diffusion process approximating the stochastic heavy-ball method needs, up to a logarithmic factor, only time linear in the square root of the inverse stepsize to escape from all saddle points. These results suggest fast convergence of its discrete-time counterpart. Our theoretical results are validated by numerical experiments.
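For readers unfamiliar with the method, the following is a minimal sketch, not taken verbatim from the paper, of the stochastic heavy-ball iteration and the kind of perturbed damped-oscillator dynamics it approximates; the symbols s (stepsize), \beta (momentum), \gamma (friction), \xi_k (stochastic gradient noise), \varepsilon (noise intensity), and W_t (Brownian motion) are illustrative choices:
\[
x_{k+1} = x_k + \beta\,(x_k - x_{k-1}) - s\,\nabla f(x_k) + s\,\xi_k,
\]
\[
dX_t = V_t\,dt, \qquad dV_t = -\bigl(\gamma V_t + \nabla f(X_t)\bigr)\,dt + \sqrt{\varepsilon}\,dW_t .
\]
Under a time rescaling of the form t \approx k\sqrt{s}, a logarithmic escape time for the continuous dynamics corresponds, up to a logarithmic factor, to on the order of 1/\sqrt{s} iterations of the discrete scheme, which is the scaling referred to in the abstract.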
Recommended Citation
W. Hu et al., "On the Global Convergence of Continuous-Time Stochastic Heavy-Ball Method for Nonconvex Optimization," Proceedings of the 2019 IEEE International Conference on Big Data (2019, Los Angeles, CA), pp. 94 - 104, Institute of Electrical and Electronics Engineers (IEEE), Dec 2019.
The definitive version is available at https://doi.org/10.1109/BigData47090.2019.9005621
Meeting Name
2019 IEEE International Conference on Big Data, Big Data 2019 (2019: Dec. 9-12, Los Angeles, CA)
Department(s)
Mathematics and Statistics
Keywords and Phrases
Dissipative Nonlinear Oscillator; Heavy-Ball Method; Non-Convex Optimization; Saddle Point; Small Random Perturbations of Hamiltonian Systems
International Standard Book Number (ISBN)
978-1-7281-0858-2
Document Type
Article - Conference proceedings
Document Version
Citation
File Type
text
Language(s)
English
Rights
© 2019 Institute of Electrical and Electronics Engineers (IEEE), All rights reserved.
Publication Date
01 Dec 2019