Generalizing the theory of cooperative inference

Pei Wang, Pushpi Paranamana, Patrick Shafto
Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics, PMLR 89:1841-1850, 2019.

Abstract

Cooperative information sharing is important to theories of human learning and has potential implications for machine learning. Prior work derived conditions for achieving optimal Cooperative Inference given strong, relatively restrictive assumptions. We relax these assumptions by demonstrating convergence for any discrete joint distribution, robustness through equivalence classes and stability under perturbation, and effectiveness by deriving bounds from structural properties of the original joint distribution. We provide geometric interpretations, connections to and implications for optimal transport and importance sampling, and conclude by outlining open questions and challenges to realizing the promise of Cooperative Inference.
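Context for the convergence claim: in this line of work, Cooperative Inference between a teacher and a learner is computed by alternately normalizing the columns and rows of the shared joint distribution matrix (Sinkhorn scaling) until a fixed point is reached. The following is a minimal Python sketch of that iteration, assuming uniform priors and strictly positive entries; the function name, tolerance, and stopping rule are this example's choices, not the paper's.

```python
import numpy as np

def cooperative_inference(joint, tol=1e-10, max_iter=10_000):
    """Illustrative Sinkhorn-style alternating normalization of a joint
    matrix (rows ~ hypotheses, columns ~ data). Sketch only: assumes
    uniform priors and strictly positive entries."""
    P = np.asarray(joint, dtype=float)
    for _ in range(max_iter):
        prev = P.copy()
        # Teacher step: normalize each column to sum to 1.
        P = P / P.sum(axis=0, keepdims=True)
        # Learner step: normalize each row to sum to 1.
        P = P / P.sum(axis=1, keepdims=True)
        if np.max(np.abs(P - prev)) < tol:
            break
    return P

# Example: a 2x2 joint distribution over (hypothesis, data) pairs.
M = np.array([[0.4, 0.1],
              [0.2, 0.3]])
print(cooperative_inference(M))
```

The paper's convergence result concerns iterations of this kind applied to arbitrary discrete joint distributions, beyond the positivity-style assumptions of prior work.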

Cite this Paper


BibTeX
@InProceedings{pmlr-v89-wang19c,
  title     = {Generalizing the theory of cooperative inference},
  author    = {Wang, Pei and Paranamana, Pushpi and Shafto, Patrick},
  booktitle = {Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics},
  pages     = {1841--1850},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Sugiyama, Masashi},
  volume    = {89},
  series    = {Proceedings of Machine Learning Research},
  month     = {16--18 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v89/wang19c/wang19c.pdf},
  url       = {https://proceedings.mlr.press/v89/wang19c.html},
  abstract  = {Cooperative information sharing is important to theories of human learning and has potential implications for machine learning. Prior work derived conditions for achieving optimal Cooperative Inference given strong, relatively restrictive assumptions. We relax these assumptions by demonstrating convergence for any discrete joint distribution, robustness through equivalence classes and stability under perturbation, and effectiveness by deriving bounds from structural properties of the original joint distribution. We provide geometric interpretations, connections to and implications for optimal transport and importance sampling, and conclude by outlining open questions and challenges to realizing the promise of Cooperative Inference.}
}
Endnote
%0 Conference Paper
%T Generalizing the theory of cooperative inference
%A Pei Wang
%A Pushpi Paranamana
%A Patrick Shafto
%B Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Masashi Sugiyama
%F pmlr-v89-wang19c
%I PMLR
%P 1841--1850
%U https://proceedings.mlr.press/v89/wang19c.html
%V 89
%X Cooperative information sharing is important to theories of human learning and has potential implications for machine learning. Prior work derived conditions for achieving optimal Cooperative Inference given strong, relatively restrictive assumptions. We relax these assumptions by demonstrating convergence for any discrete joint distribution, robustness through equivalence classes and stability under perturbation, and effectiveness by deriving bounds from structural properties of the original joint distribution. We provide geometric interpretations, connections to and implications for optimal transport and importance sampling, and conclude by outlining open questions and challenges to realizing the promise of Cooperative Inference.
APA
Wang, P., Paranamana, P. & Shafto, P. (2019). Generalizing the theory of cooperative inference. Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 89:1841-1850. Available from https://proceedings.mlr.press/v89/wang19c.html.