Stochastic variance-reduced cubic regularization methods
We propose a stochastic variance-reduced cubic regularized Newton method (SVRC) for non-convex optimization. At the core of SVRC is a novel semi-stochastic gradient, along with a semi-stochastic Hessian, specifically designed for the cubic regularization method. For a non-convex function with n component functions, we show that our algorithm is guaranteed to converge to an (ϵ, √ϵ)-approximate local minimum within Õ(n^{4/5}/ϵ^{3/2}) second-order oracle calls, which outperforms state-of-the-art cubic regularization algorithms, including subsampled cubic regularization. To further reduce the sample complexity of Hessian matrix computation in cubic regularization based methods, we also propose a sample-efficient stochastic variance-reduced cubic regularization (Lite-SVRC) algorithm for finding the local minimum more efficiently. Lite-SVRC converges to an (ϵ, √ϵ)-approximate local minimum within Õ(n + n^{2/3}/ϵ^{3/2}) Hessian sample complexity, which is faster than all existing cubic regularization based methods. Numerical experiments on different non-convex optimization problems, conducted on real datasets, validate our theoretical results for both SVRC and Lite-SVRC.
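To make the abstract's description concrete, below is a minimal NumPy sketch of one SVRC epoch. It assumes SVRG-style control-variate estimators with a second-order correction term, which is one natural reading of "semi-stochastic" here; the function name svrc_epoch, the parameters T, bg, bh, M, and the plain gradient-descent solver for the cubic subproblem are all illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of one SVRC epoch. The estimator forms and the
# cubic-subproblem solver are illustrative assumptions, not the paper's
# exact algorithm or parameter choices.
import numpy as np

def svrc_epoch(grads, hessians, x_tilde, T=5, bg=4, bh=4, M=10.0, rng=None):
    """One epoch of a variance-reduced cubic-regularized Newton method.

    grads:    list of per-component gradient functions grad_i(x)
    hessians: list of per-component Hessian functions hess_i(x)
    x_tilde:  snapshot point for this epoch
    """
    rng = rng or np.random.default_rng(0)
    n, d = len(grads), x_tilde.shape[0]
    # Full gradient and Hessian at the snapshot, computed once per epoch.
    g_full = sum(g(x_tilde) for g in grads) / n
    H_full = sum(h(x_tilde) for h in hessians) / n
    x = x_tilde.copy()
    for _ in range(T):
        dx = x - x_tilde
        # Semi-stochastic gradient: control variate plus a second-order
        # correction anchored at the snapshot.
        Ig = rng.choice(n, size=bg, replace=False)
        v = g_full + H_full @ dx + sum(
            grads[i](x) - grads[i](x_tilde) - hessians[i](x_tilde) @ dx
            for i in Ig) / bg
        # Semi-stochastic Hessian: control variate on the Hessian.
        Ih = rng.choice(n, size=bh, replace=False)
        U = H_full + sum(
            hessians[j](x) - hessians[j](x_tilde) for j in Ih) / bh
        # Cubic-regularized subproblem:
        #   h* = argmin_h  v^T h + 0.5 h^T U h + (M/6) ||h||^3,
        # solved here by plain gradient descent on the model as a
        # stand-in for the exact/approximate cubic solvers used in practice.
        h = np.zeros(d)
        for _ in range(200):
            model_grad = v + U @ h + 0.5 * M * np.linalg.norm(h) * h
            h -= 0.01 * model_grad
        x = x + h
    return x
```

A toy invocation on a sum of quadratic components, purely to show the calling convention:

```python
# Illustrative usage only: random symmetric quadratics as components.
d, n = 3, 20
rng = np.random.default_rng(1)
As = [0.5 * (A + A.T) for A in (rng.standard_normal((d, d)) for _ in range(n))]
bs = [rng.standard_normal(d) for _ in range(n)]
grads = [lambda x, A=A, b=b: A @ x + b for A, b in zip(As, bs)]
hessians = [lambda x, A=A: A for A in As]
x_out = svrc_epoch(grads, hessians, x_tilde=np.zeros(d))
```

The batch sizes and epoch length above are arbitrary defaults; the n^{4/5} and n^{2/3} complexities stated in the abstract come from specific, carefully chosen values of these parameters in the paper's analysis.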
Related Subject Headings
- Artificial Intelligence & Image Processing
- 4905 Statistics
- 4611 Machine learning
- 17 Psychology and Cognitive Sciences
- 08 Information and Computing Sciences