Stochastic Variance-Reduced Cubic Regularized Newton Methods
Conference Publication
Zhou, D; Xu, P; Gu, Q
Published in: Proceedings of Machine Learning Research
January 1, 2018
We propose a stochastic variance-reduced cubic regularized Newton method (SVRC) for nonconvex optimization. At the core of our algorithm is a novel semi-stochastic gradient along with a semi-stochastic Hessian, which are specifically designed for the cubic regularization method. We show that our algorithm is guaranteed to converge to an (ε, √ε)-approximate local minimum within Õ(n^{4/5}/ε^{3/2}) second-order oracle calls, which outperforms state-of-the-art cubic regularization algorithms, including subsampled cubic regularization. Our work also sheds light on the application of variance reduction techniques to high-order nonconvex optimization methods. Thorough experiments on various nonconvex optimization problems support our theory.
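The abstract's central construction is a semi-stochastic gradient and a semi-stochastic Hessian, both anchored at a per-epoch reference point, plugged into a cubic-regularized Newton step. A minimal sketch of that idea follows; it is not the authors' code — the toy logistic finite-sum problem, the batch sizes, and the gradient-descent cubic-subproblem solver are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative finite-sum problem: F(x) = (1/n) sum_i log(1 + exp(-y_i a_i^T x))
n, d = 50, 5
A = rng.standard_normal((n, d))
y = rng.choice([-1.0, 1.0], n)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def grad_i(x, i):
    # Gradient of the i-th component log(1 + exp(-y_i a_i^T x))
    return -sigmoid(-y[i] * (A[i] @ x)) * y[i] * A[i]

def hess_i(x, i):
    # Hessian of the i-th component
    s = sigmoid(-y[i] * (A[i] @ x))
    return s * (1.0 - s) * np.outer(A[i], A[i])

def full_grad(x):
    return np.mean([grad_i(x, i) for i in range(n)], axis=0)

def full_hess(x):
    return np.mean([hess_i(x, i) for i in range(n)], axis=0)

def solve_cubic(v, U, M, iters=200, lr=0.1):
    # Approximately minimize the cubic model  v^T h + 0.5 h^T U h + (M/6)||h||^3
    # by gradient descent -- a simple stand-in for an exact subproblem solver.
    h = np.zeros_like(v)
    for _ in range(iters):
        h -= lr * (v + U @ h + 0.5 * M * np.linalg.norm(h) * h)
    return h

def svrc(x0, epochs=5, inner=10, bg=10, bh=10, M=1.0):
    x = x0.copy()
    for _ in range(epochs):
        # Epoch reference point with full gradient and Hessian
        x_ref, g_ref, H_ref = x.copy(), full_grad(x), full_hess(x)
        for _ in range(inner):
            Ig = rng.choice(n, bg, replace=False)
            Ih = rng.choice(n, bh, replace=False)
            dx = x - x_ref
            # Semi-stochastic gradient: minibatch difference plus a
            # second-order correction anchored at the reference point.
            v = g_ref + H_ref @ dx + np.mean(
                [grad_i(x, i) - grad_i(x_ref, i) - hess_i(x_ref, i) @ dx
                 for i in Ig], axis=0)
            # Semi-stochastic Hessian: minibatch difference anchored at the reference.
            U = H_ref + np.mean(
                [hess_i(x, j) - hess_i(x_ref, j) for j in Ih], axis=0)
            # Cubic-regularized Newton step
            x = x + solve_cubic(v, U, M)
    return x

x_star = svrc(np.zeros(d))
```

Anchoring both estimators at the reference point is what drives their variance down as the iterate approaches the reference, which is the mechanism behind the improved second-order oracle complexity claimed above.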
Published In
Proceedings of Machine Learning Research
EISSN
2640-3498
Publication Date
January 1, 2018
Volume
80
Start / End Page
5990 / 5999
Citation
APA: Zhou, D., Xu, P., & Gu, Q. (2018). Stochastic Variance-Reduced Cubic Regularized Newton Methods. In Proceedings of Machine Learning Research (Vol. 80, pp. 5990–5999).
Chicago: Zhou, D., P. Xu, and Q. Gu. “Stochastic Variance-Reduced Cubic Regularized Newton Methods.” In Proceedings of Machine Learning Research, 80:5990–99, 2018.
ICMJE: Zhou D, Xu P, Gu Q. Stochastic Variance-Reduced Cubic Regularized Newton Methods. In: Proceedings of Machine Learning Research. 2018. p. 5990–9.
MLA: Zhou, D., et al. “Stochastic Variance-Reduced Cubic Regularized Newton Methods.” Proceedings of Machine Learning Research, vol. 80, 2018, pp. 5990–99.
NLM: Zhou D, Xu P, Gu Q. Stochastic Variance-Reduced Cubic Regularized Newton Methods. Proceedings of Machine Learning Research. 2018. p. 5990–5999.