Online Learning of Combinatorial Objects via Extended Formulation
Proceedings of Algorithmic Learning Theory, PMLR 83:702-724, 2018.
Abstract
The standard techniques for online learning of combinatorial objects
perform multiplicative updates followed by projections onto the convex hull of all the objects.
However, this methodology can be expensive if the convex hull contains many facets.
For example, the convex hull of $n$-symbol Huffman trees is known to have exponentially many facets.
We get around this difficulty by exploiting extended formulations, which encode the polytope of combinatorial objects in a higher-dimensional “extended” space with only polynomially many facets.
We develop a general framework for converting extended formulations into efficient online algorithms with good relative loss bounds.
We present applications of our framework to online learning of Huffman trees and permutations.
The regret bounds of the resulting algorithms are within a factor of $O(\sqrt{\log(n)})$
of the state-of-the-art specialized algorithms for permutations,
and, depending on the loss regime, improve on or match the state of the art for Huffman trees.
Our method is general and can be applied to other combinatorial objects.
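
To make the baseline concrete, here is a minimal sketch (not taken from the paper) of the standard approach the abstract contrasts against: a multiplicative update followed by a projection back onto the convex hull of the objects. The objects are assumed to be $n \times n$ permutation matrices, whose convex hull is the Birkhoff polytope, and the relative-entropy projection is approximated by Sinkhorn row/column normalization; the function names, loss model, and parameters are illustrative.

```python
# Sketch of the standard "multiplicative update + projection" scheme,
# illustrated for permutations (Birkhoff polytope). Not the paper's
# extended-formulation algorithm; names and losses are hypothetical.
import numpy as np

def sinkhorn_project(W, iters=100):
    """Approximate relative-entropy projection onto doubly stochastic matrices."""
    for _ in range(iters):
        W = W / W.sum(axis=1, keepdims=True)  # normalize rows
        W = W / W.sum(axis=0, keepdims=True)  # normalize columns
    return W

def multiplicative_update_with_projection(n=5, T=100, eta=0.5, seed=0):
    rng = rng = np.random.default_rng(seed)
    W = np.full((n, n), 1.0 / n)               # uniform starting point in the polytope
    total_loss = 0.0
    for _ in range(T):
        L = rng.uniform(size=(n, n))            # per-entry losses in [0, 1] (stand-in for the adversary)
        total_loss += np.sum(W * L)             # expected loss of a permutation sampled from W
        W = W * np.exp(-eta * L)                # multiplicative (exponentiated) update
        W = sinkhorn_project(W)                 # project back onto the convex hull
    return total_loss

print(multiplicative_update_with_projection())
```

The projection step is exactly where the cost appears for polytopes with many facets; the paper's approach avoids working directly with such a polytope.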
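
For reference, a standard textbook-style definition of an extended formulation, which the abstract invokes informally; the notation below is generic and not specific to this paper.

```latex
% A polytope P in R^d admits an extended formulation of size m if there are
% matrices E, F and a vector g (defining m inequalities) such that P is the
% projection of the higher-dimensional polytope Q onto the x-coordinates:
\[
  Q = \{\, (x, y) \in \mathbb{R}^{d} \times \mathbb{R}^{k} : E x + F y \le g \,\},
  \qquad
  P = \{\, x \in \mathbb{R}^{d} : \exists\, y \text{ with } (x, y) \in Q \,\}.
\]
% Even when P itself has exponentially many facets, Q may be described by only
% polynomially many inequalities, so the learner can maintain its weights in the
% extended space and project there instead of over P directly.
```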