Open Problem: Online Local Learning
Proceedings of The 27th Conference on Learning Theory, PMLR 35:1290-1294, 2014.
Abstract
In many learning problems, we attempt to infer \emph{global} structure in the interest of making \emph{local} predictions. For example, we might try to infer the skills of the competitors in a tournament in order to predict who will win a match, or we might try to predict characteristics of users and films in order to predict which users will like which films. In even relatively simple settings of this type, it is typically NP-hard to find the latent data which best explain some observations. But do these complexity-theoretic obstructions actually prevent us from making good predictions? Because each prediction depends on only a small number of variables, it might be possible to make good predictions without actually finding a good global assignment. This may seem to be a purely technical distinction, but recent work has shown that several local prediction problems actually \emph{are} easy even though the corresponding global inference problem is hard. The question we pose is: how general is this phenomenon?
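To make the tournament example concrete, here is a minimal toy sketch in Python, not the paper's algorithm: the Elo-style update, the stochastic outcome model, and all function names and parameters are illustrative assumptions. It shows a learner making local pairwise predictions from a maintained global rating vector, without ever attempting the (generally NP-hard) problem of finding the latent skills that best explain all observations.

```python
# Toy sketch of the online local prediction game (illustrative only).
# Each round an adversary reveals a pair of players, the learner predicts
# who wins, the true outcome is revealed, and the learner suffers log loss.
# The learner below never solves a global inference problem; it only makes
# local predictions via a simple Elo-style rating update.

import math
import random

def elo_predict(ratings, i, j):
    """Predicted probability that player i beats player j."""
    return 1.0 / (1.0 + math.pow(10.0, (ratings[j] - ratings[i]) / 400.0))

def elo_update(ratings, i, j, i_won, k=32.0):
    """Standard Elo update after observing the match outcome."""
    p = elo_predict(ratings, i, j)
    ratings[i] += k * ((1.0 if i_won else 0.0) - p)
    ratings[j] += k * ((0.0 if i_won else 1.0) - (1.0 - p))

def run_game(n_players=20, n_rounds=5000, seed=0):
    rng = random.Random(seed)
    true_skill = [rng.gauss(0, 1) for _ in range(n_players)]  # hidden global structure
    ratings = [0.0] * n_players                               # learner's state
    total_loss = 0.0
    for _ in range(n_rounds):
        i, j = rng.sample(range(n_players), 2)                # pair revealed this round
        p = elo_predict(ratings, i, j)                        # local prediction
        # Toy outcome model: a logistic function of the hidden skill gap
        # (the paper's setting allows adversarial outcomes instead).
        i_won = rng.random() < 1.0 / (1.0 + math.exp(true_skill[j] - true_skill[i]))
        total_loss += -math.log(p if i_won else 1.0 - p)      # log loss this round
        elo_update(ratings, i, j, i_won)
    return total_loss / n_rounds

if __name__ == "__main__":
    print(f"average log loss: {run_game():.3f}")
```

The point of the sketch is only the distinction drawn in the abstract: each prediction touches just two variables, so a learner can do well on the local prediction task even though the corresponding global fitting problem is intractable in the worst case.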