Online Clustering of Bandits


Claudio Gentile, Shuai Li, Giovanni Zappella;
Proceedings of the 31st International Conference on Machine Learning, PMLR 32(2):757-765, 2014.

Abstract

We introduce a novel algorithmic approach to content recommendation based on adaptive clustering of exploration-exploitation ("bandit") strategies. We provide a sharp regret analysis of this algorithm in a standard stochastic noise setting, demonstrate its scalability properties, and prove its effectiveness on a number of artificial and real-world datasets. Our experiments show a significant increase in prediction performance over state-of-the-art methods for bandit problems.
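To make the idea of adaptive clustering of bandit strategies more concrete, below is a minimal illustrative sketch in Python. It is not the paper's algorithm: the class name ClusteredLinearBandits, the exploration weight alpha, and the fixed edge-deletion threshold split_threshold are assumptions made purely for illustration (the paper drives cluster splits with confidence bounds rather than a fixed constant). Each user keeps ridge-regression statistics; recommendations use an upper confidence bound computed from the statistics aggregated over the user's current cluster, and graph edges between users are removed once their estimated parameter vectors drift too far apart.

import numpy as np

class ClusteredLinearBandits:
    def __init__(self, n_users, dim, alpha=1.0, split_threshold=0.5):
        self.d = dim
        self.alpha = alpha                      # exploration weight (assumed value)
        self.split_threshold = split_threshold  # assumed fixed split rule, for illustration only
        # Per-user ridge-regression statistics.
        self.A = [np.eye(dim) for _ in range(n_users)]    # Gram matrices
        self.b = [np.zeros(dim) for _ in range(n_users)]  # reward-weighted contexts
        # Start with all users in one cluster: fully connected undirected graph.
        self.graph = np.ones((n_users, n_users), dtype=bool)

    def _cluster(self, user):
        # Users currently linked to `user` (including `user` itself).
        return np.flatnonzero(self.graph[user])

    def recommend(self, user, contexts):
        # Pick the context (arm) with the highest upper confidence bound,
        # using statistics aggregated over the user's current cluster.
        members = self._cluster(user)
        A = sum(self.A[j] for j in members) - (len(members) - 1) * np.eye(self.d)
        b = sum(self.b[j] for j in members)
        A_inv = np.linalg.inv(A)
        w = A_inv @ b
        ucb = contexts @ w + self.alpha * np.sqrt(
            np.einsum("ij,jk,ik->i", contexts, A_inv, contexts))
        return int(np.argmax(ucb))

    def update(self, user, context, reward):
        # Update the user's own estimate, then prune edges to users whose
        # estimates have drifted apart (the adaptive clustering step).
        self.A[user] += np.outer(context, context)
        self.b[user] += reward * context
        w_u = np.linalg.solve(self.A[user], self.b[user])
        for j in np.flatnonzero(self.graph[user]):
            if j == user:
                continue
            w_j = np.linalg.solve(self.A[j], self.b[j])
            if np.linalg.norm(w_u - w_j) > self.split_threshold:
                self.graph[user, j] = self.graph[j, user] = False

In a recommendation loop one would call recommend(user, contexts) to select an item, observe a reward, and pass it back through update; the graph then gradually decomposes the user set into clusters of users with similar estimated preferences.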
