Abstract
Stochastic compositional optimization arises in many important machine learning applications. The objective function is a composition of two expectations of stochastic functions, which makes it more challenging to optimize than vanilla stochastic optimization problems. In this paper, we investigate stochastic compositional optimization in the general smooth non-convex setting. We employ the recently developed idea of Stochastic Recursive Gradient Descent to design a novel algorithm named SARAH-Compositional, and prove a sharp Incremental First-order Oracle (IFO) complexity upper bound for stochastic compositional optimization: 𝒪((n + m)^{1/2} ε^{-2}) in the finite-sum case and 𝒪(ε^{-3}) in the online case. This matches the best known IFO complexity for non-convex stochastic compositional optimization. Numerical experiments on risk-averse portfolio management validate the superiority of SARAH-Compositional over several rival algorithms.
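To give a rough feel for the recursive-gradient idea behind SARAH-Compositional, the sketch below applies a SARAH-style estimator to a toy compositional objective F(x) = f(E[g_w(x)]). It is a heavily simplified illustration, not the paper's exact algorithm (which also maintains recursive estimators of the inner function value and Jacobian with specific batch sizes): the synthetic data, the deterministic outer function f(y) = ½‖y‖², and all step sizes and batch sizes are assumptions chosen for readability.

```python
# Illustrative sketch only (not the authors' exact SARAH-Compositional method):
# a SARAH-style recursive gradient estimator on a toy compositional objective
# F(x) = f(mean_w g_w(x)) with f(y) = 0.5 * ||y||^2 and affine inner maps g_w.
import numpy as np

rng = np.random.default_rng(0)
d, n = 5, 200                        # dimension, number of inner samples
A = rng.standard_normal((n, d, d)) * 0.1
b = rng.standard_normal((n, d))

def g(x, idx):                       # inner map g_w(x), averaged over a sample batch
    return np.mean([A[i] @ x + b[i] for i in idx], axis=0)

def g_jac(x, idx):                   # Jacobian of the inner map on the same batch
    return np.mean([A[i] for i in idx], axis=0)

def f_grad(y):                       # gradient of the (deterministic) outer function
    return y

def comp_grad(x, idx):               # sampled compositional gradient (dg/dx)^T grad f(g(x))
    return g_jac(x, idx).T @ f_grad(g(x, idx))

x = rng.standard_normal(d)
eta, epochs, inner_steps, batch = 0.05, 10, 20, 10

for _ in range(epochs):
    # checkpoint: full-batch gradient anchors the recursive estimator
    v = comp_grad(x, range(n))
    x_prev = x.copy()
    x = x - eta * v
    for _ in range(inner_steps):
        idx = rng.choice(n, size=batch, replace=False)
        # SARAH-style recursion: reuse the same minibatch at x and x_prev
        v = v + comp_grad(x, idx) - comp_grad(x_prev, idx)
        x_prev = x.copy()
        x = x - eta * v

print("final gradient norm:", np.linalg.norm(comp_grad(x, range(n))))
```

The key design choice illustrated here is the recursion v ← v + ∇F̂(x_t) − ∇F̂(x_{t−1}) evaluated on the same minibatch at consecutive iterates, which keeps the estimator's variance small between full-gradient checkpoints; the constants above are placeholders rather than tuned values.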
Recommended Citation
W. Hu et al., "Efficient Smooth Non-Convex Stochastic Compositional Optimization via Stochastic Recursive Gradient Descent," Proceedings of Advances in Neural Information Processing Systems (2019, Vancouver, Canada), vol. 32, Neural Information Processing Systems Foundation, Dec 2019.
Meeting Name
33rd Conference on Neural Information Processing Systems, NeurIPS 2019 (2019: Dec. 8-14, Vancouver, Canada)
Department(s)
Mathematics and Statistics
International Standard Serial Number (ISSN)
1049-5258
Document Type
Article - Conference proceedings
Document Version
Final Version
File Type
text
Language(s)
English
Rights
© 2019 The Authors. All rights reserved.
Publication Date
14 Dec 2019