Protocols for Learning Classifiers on Distributed Data

Hal Daume III, Jeff Phillips, Avishek Saha, Suresh Venkatasubramanian
Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics, PMLR 22:282-290, 2012.

Abstract

We consider the problem of learning classifiers for labeled data that has been distributed across several nodes. Our goal is to find a single classifier, with small approximation error, across all datasets while minimizing the communication between nodes. This setting models real-world communication bottlenecks in the processing of massive distributed datasets. We present several very general sampling-based solutions as well as some two-way protocols which have a provable exponential speed-up over any one-way protocol. We focus on core problems for noiseless data distributed across two or more nodes. The techniques we introduce are reminiscent of active learning, but rather than actively probing labels, nodes actively communicate with each other, each node simultaneously learning the important data from another node.
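
To make the setting concrete, the sketch below shows a noiseless two-node exchange that learns a one-dimensional threshold classifier with O(1) communication. It is an illustration written for this page, not the paper's protocol: the 1-D setting, the function names, and the toy data are all assumptions, and the paper's actual protocols address richer hypothesis classes.

    # Illustrative sketch only (not the paper's protocol): two nodes hold
    # labeled 1-D points and want a single consistent threshold classifier
    # while exchanging as little data as possible.

    def local_summary(points):
        # Each node summarizes its data by the two coordinates that pin
        # down the threshold locally: its largest negative point and its
        # smallest positive point.
        negs = [x for x, y in points if y == -1]
        poss = [x for x, y in points if y == +1]
        return (max(negs) if negs else float("-inf"),
                min(poss) if poss else float("inf"))

    def combine(summary_a, summary_b):
        # Merge the two constant-size summaries into one threshold that is
        # consistent with the union of both datasets (noiseless data).
        lo = max(summary_a[0], summary_b[0])  # rightmost negative overall
        hi = min(summary_a[1], summary_b[1])  # leftmost positive overall
        assert lo < hi, "requires separable (noiseless) data"
        return (lo + hi) / 2.0                # any value in (lo, hi) works

    # Toy data: (coordinate, label) pairs held by node A and node B.
    node_a = [(0.1, -1), (0.4, -1), (0.9, +1)]
    node_b = [(0.2, -1), (0.6, +1), (0.8, +1)]

    # Each node transmits only two numbers, independent of its dataset
    # size, and the merged threshold (here 0.5) classifies all points
    # on both nodes correctly.
    threshold = combine(local_summary(node_a), local_summary(node_b))
    classify = lambda x: +1 if x > threshold else -1

In this toy case a single one-way message already suffices; the interesting regime in the paper is higher dimensions, where the authors show that two-way (interactive) protocols can be exponentially cheaper than any one-way protocol.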

Cite this Paper


BibTeX
@InProceedings{pmlr-v22-daume12,
  title     = {Protocols for Learning Classifiers on Distributed Data},
  author    = {III, Hal Daume and Phillips, Jeff and Saha, Avishek and Venkatasubramanian, Suresh},
  booktitle = {Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics},
  pages     = {282--290},
  year      = {2012},
  editor    = {Lawrence, Neil D. and Girolami, Mark},
  volume    = {22},
  series    = {Proceedings of Machine Learning Research},
  address   = {La Palma, Canary Islands},
  month     = {21--23 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v22/daume12/daume12.pdf},
  url       = {https://proceedings.mlr.press/v22/daume12.html},
  abstract  = {We consider the problem of learning classifiers for labeled data that has been distributed across several nodes. Our goal is to find a single classifier, with small approximation error, across all datasets while minimizing the communication between nodes. This setting models real-world communication bottlenecks in the processing of massive distributed datasets. We present several very general sampling-based solutions as well as some two-way protocols which have a provable exponential speed-up over any one-way protocol. We focus on core problems for noiseless data distributed across two or more nodes. The techniques we introduce are reminiscent of active learning, but rather than actively probing labels, nodes actively communicate with each other, each node simultaneously learning the important data from another node.}
}
Endnote
%0 Conference Paper
%T Protocols for Learning Classifiers on Distributed Data
%A Hal Daume III
%A Jeff Phillips
%A Avishek Saha
%A Suresh Venkatasubramanian
%B Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2012
%E Neil D. Lawrence
%E Mark Girolami
%F pmlr-v22-daume12
%I PMLR
%P 282--290
%U https://proceedings.mlr.press/v22/daume12.html
%V 22
%X We consider the problem of learning classifiers for labeled data that has been distributed across several nodes. Our goal is to find a single classifier, with small approximation error, across all datasets while minimizing the communication between nodes. This setting models real-world communication bottlenecks in the processing of massive distributed datasets. We present several very general sampling-based solutions as well as some two-way protocols which have a provable exponential speed-up over any one-way protocol. We focus on core problems for noiseless data distributed across two or more nodes. The techniques we introduce are reminiscent of active learning, but rather than actively probing labels, nodes actively communicate with each other, each node simultaneously learning the important data from another node.
RIS
TY - CPAPER
TI - Protocols for Learning Classifiers on Distributed Data
AU - Hal Daume III
AU - Jeff Phillips
AU - Avishek Saha
AU - Suresh Venkatasubramanian
BT - Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics
DA - 2012/04/21
ED - Neil D. Lawrence
ED - Mark Girolami
ID - pmlr-v22-daume12
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 22
SP - 282
EP - 290
L1 - http://proceedings.mlr.press/v22/daume12/daume12.pdf
UR - https://proceedings.mlr.press/v22/daume12.html
AB - We consider the problem of learning classifiers for labeled data that has been distributed across several nodes. Our goal is to find a single classifier, with small approximation error, across all datasets while minimizing the communication between nodes. This setting models real-world communication bottlenecks in the processing of massive distributed datasets. We present several very general sampling-based solutions as well as some two-way protocols which have a provable exponential speed-up over any one-way protocol. We focus on core problems for noiseless data distributed across two or more nodes. The techniques we introduce are reminiscent of active learning, but rather than actively probing labels, nodes actively communicate with each other, each node simultaneously learning the important data from another node.
ER -
APA
Daume III, H., Phillips, J., Saha, A. & Venkatasubramanian, S. (2012). Protocols for Learning Classifiers on Distributed Data. Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 22:282-290. Available from https://proceedings.mlr.press/v22/daume12.html.
