Open Problem: Do Good Algorithms Necessarily Query Bad Points?
Publication: Conference
Ge, R; Jain, P; Kakade, SM; Kidambi, R; Nagaraj, DM; Netrapalli, P
Published in: Proceedings of Machine Learning Research
January 1, 2019
Folklore results in the theory of Stochastic Approximation indicate the (minimax) optimality of Stochastic Gradient Descent (SGD) (Robbins and Monro, 1951) with polynomially decaying step sizes and iterate averaging (Ruppert, 1988; Polyak and Juditsky, 1992) for classes of stochastic convex optimization problems. Building on these folklore results and some recent developments, this manuscript considers a more subtle question: does any algorithm necessarily (information theoretically) have to query iterates that are sub-optimal infinitely often?
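The abstract's reference to SGD with polynomially decaying step sizes and Polyak-Ruppert iterate averaging can be sketched as follows; this is a minimal illustration on a hypothetical one-dimensional quadratic with noisy gradients, not the construction from the paper:

```python
import random

def sgd_polyak_ruppert(noisy_grad, x0, n_steps, c=1.0, alpha=0.5):
    """SGD with polynomially decaying step sizes eta_t = c / t**alpha.

    Returns both the final iterate and the Polyak-Ruppert average
    of all iterates, the quantity whose optimality the abstract
    attributes to Ruppert (1988) and Polyak and Juditsky (1992).
    """
    x = x0
    avg = 0.0
    for t in range(1, n_steps + 1):
        eta = c / t ** alpha          # polynomially decaying step size
        x -= eta * noisy_grad(x)      # stochastic gradient step
        avg += (x - avg) / t          # running average of the iterates
    return x, avg

# Toy stochastic objective (an assumption for illustration):
# f(x) = x^2 / 2, with gradient queries corrupted by Gaussian noise.
random.seed(0)
grad_oracle = lambda x: x + random.gauss(0.0, 1.0)
last, averaged = sgd_polyak_ruppert(grad_oracle, x0=5.0, n_steps=10_000)
```

On this toy problem the averaged iterate concentrates near the minimizer `x = 0`, while the individual queried iterates keep fluctuating with the gradient noise; the open problem asks whether such sub-optimal queries are unavoidable for any optimal algorithm.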
Published In
Proceedings of Machine Learning Research
EISSN
2640-3498
Publication Date
January 1, 2019
Volume
99
Start / End Page
3190 / 3193
Citation
Ge, R., Jain, P., Kakade, S. M., Kidambi, R., Nagaraj, D. M., & Netrapalli, P. (2019). Open Problem: Do Good Algorithms Necessarily Query Bad Points? In Proceedings of Machine Learning Research (Vol. 99, pp. 3190–3193).