Open Problem: Online Local Learning

Paul Christiano
Proceedings of The 27th Conference on Learning Theory, PMLR 35:1290-1294, 2014.

Abstract

In many learning problems, we attempt to infer \emph{global} structure in the interest of making \emph{local} predictions. For example, we might try to infer the skills of the competitors in a tournament in order to predict who will win a match, or we might try to infer characteristics of users and films in order to predict which users will like which films. In even relatively simple settings of this type, it is typically NP-hard to find the latent data which best explain some observations. But do these complexity-theoretic obstructions actually prevent us from making good predictions? Because each prediction depends on only a small number of variables, it might be possible to make good predictions without actually finding a good global assignment. This may seem to be a purely technical distinction, but recent work has shown that several local prediction problems actually \emph{are} easy even though the corresponding global inference problem is hard. The question we pose is: how general is this phenomenon?
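As a concrete illustration of what a local prediction problem looks like, consider the tournament example as an online loop: the learner maintains a latent skill estimate for each competitor, and each prediction reads (and each update touches) only the two skills involved in that match. The sketch below is a minimal Elo-style illustration for exposition only, not the algorithm or formal model from the paper; the LocalPredictor class, the logistic outcome model, and the learning rate are all assumptions introduced here.

import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

class LocalPredictor:
    """Illustrative only: global latent skills, purely local predictions."""

    def __init__(self, n_players, lr=0.1):
        self.skill = [0.0] * n_players  # the "global structure" being inferred
        self.lr = lr

    def predict(self, i, j):
        # Probability that player i beats player j. The prediction depends
        # only on the two local variables skill[i] and skill[j].
        return sigmoid(self.skill[i] - self.skill[j])

    def update(self, i, j, i_won):
        # Online logistic-loss gradient step on the observed outcome.
        gradient = (1.0 if i_won else 0.0) - self.predict(i, j)
        self.skill[i] += self.lr * gradient
        self.skill[j] -= self.lr * gradient

# Toy run: outcomes drawn from hidden true skills. The learner never solves
# a global inference problem; it only touches two coordinates per round.
random.seed(0)
true_skill = [random.gauss(0.0, 1.0) for _ in range(10)]
model = LocalPredictor(n_players=10)
for _ in range(1000):
    i, j = random.sample(range(10), 2)
    i_won = random.random() < sigmoid(true_skill[i] - true_skill[j])
    p = model.predict(i, j)   # local prediction made before the outcome
    model.update(i, j, i_won)

The open problem asks whether predictions of this purely local kind can, in general, be made competitive with the best global assignment of latent values, even in settings where finding that assignment is NP-hard.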

Cite this Paper


BibTeX
@InProceedings{pmlr-v35-christiano14,
  title     = {Open Problem: Online Local Learning},
  author    = {Christiano, Paul},
  booktitle = {Proceedings of The 27th Conference on Learning Theory},
  pages     = {1290--1294},
  year      = {2014},
  editor    = {Balcan, Maria Florina and Feldman, Vitaly and Szepesvári, Csaba},
  volume    = {35},
  series    = {Proceedings of Machine Learning Research},
  address   = {Barcelona, Spain},
  month     = {13--15 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v35/christiano14.pdf},
  url       = {https://proceedings.mlr.press/v35/christiano14.html}
}
APA
Christiano, P. (2014). Open Problem: Online Local Learning. Proceedings of The 27th Conference on Learning Theory, in Proceedings of Machine Learning Research 35:1290-1294. Available from https://proceedings.mlr.press/v35/christiano14.html.
