Reducing exploration of dying arms in mortal bandits

Published

Conference Paper

Mortal bandits have proven to be extremely useful for providing news article recommendations, running automated online advertising campaigns, and for other applications where the set of available options changes over time. Previous work on this problem showed how to regulate exploration of new arms when they have recently appeared, but it does not adapt when arms are about to disappear. Since in most applications we can determine either exactly or approximately when arms will disappear, we can leverage this information to improve performance: we should not be exploring arms that are about to disappear. We provide adapted algorithms, regret bounds, and experiments for this setting, showing a clear benefit from regulating greed (the exploration/exploitation trade-off) for arms that will soon disappear. We illustrate numerical performance on the Yahoo! Front Page Today Module User Click Log Dataset.
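The core idea in the abstract can be illustrated with a minimal sketch. The snippet below is a hypothetical lifetime-aware variant of epsilon-greedy, not the paper's actual algorithms or analysis: exploration is restricted to arms whose remaining lifetime exceeds a threshold, while exploitation may still use any live arm. The names `choose_arm`, `expires_at`, and `horizon_threshold` are illustrative assumptions.

```python
import random

def choose_arm(arms, t, epsilon=0.1, horizon_threshold=5):
    """Epsilon-greedy restricted so that dying arms are never *explored*.

    arms: dict arm_id -> {"mean_estimate": float, "expires_at": int}
    t: current round. Returns an arm id, or None if no arm is alive.
    (A hypothetical sketch; the paper's algorithms differ in detail.)
    """
    # Only arms that have not yet expired can be played at all.
    alive = {a: v for a, v in arms.items() if v["expires_at"] > t}
    if not alive:
        return None
    # Arms with enough remaining lifetime are eligible for exploration.
    long_lived = [a for a, v in alive.items()
                  if v["expires_at"] - t > horizon_threshold]
    if long_lived and random.random() < epsilon:
        return random.choice(long_lived)  # explore long-lived arms only
    # Otherwise exploit the best current estimate among all live arms.
    return max(alive, key=lambda a: alive[a]["mean_estimate"])
```

Under this rule, exploration budget is never spent on an arm whose feedback can no longer be used before it disappears, which is the intuition the abstract describes.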

Cited Authors

  • Tracà, S; Rudin, C; Yan, W

Published Date

  • January 1, 2019

Published In

  • 35th Conference on Uncertainty in Artificial Intelligence, UAI 2019

Citation Source

  • Scopus