Open problem: Direct Sums in Learning Theory

Steve Hanneke, Shay Moran, Tom Waknine
Proceedings of Thirty Seventh Conference on Learning Theory, PMLR 247:5325-5329, 2024.

Abstract

In computer science, the term "direct sum" refers to fundamental questions about the scaling of computational or information complexity with respect to multiple task instances. Consider an algorithmic task \(T\) and a computational resource \(C\). For instance, \(T\) might be the task of computing a polynomial, with \(C\) representing the number of arithmetic operations required, or \(T\) could be a learning task with its sample complexity as \(C\). The direct sum inquiry focuses on the cost of solving \(k\) separate instances of \(T\), particularly how this aggregate cost compares to the resources needed for a single instance. Typically, the cost for multiple instances is at most \(k\) times the cost of one, since each can be handled independently. However, there are intriguing scenarios where the total cost for \(k\) instances is less than this linear relationship. Such questions naturally extend to the machine-learning setting, in which one may be interested in solving several learning problems at once. This notion of direct sums of learning problems gives rise to various natural questions and interesting problems.
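The trivial linear bound described in the abstract can be stated compactly as follows. This is an illustrative sketch only: the notation \(\mathrm{cost}(\cdot)\) and the \(k\)-fold task \(T^{\oplus k}\) are not taken from the paper.

```latex
% Let cost(T) denote the amount of resource C needed to solve task T,
% and let T^{\oplus k} denote k independent instances of T solved together.
% Handling each instance separately yields the trivial upper bound
\[
  \mathrm{cost}\bigl(T^{\oplus k}\bigr) \;\le\; k \cdot \mathrm{cost}(T),
\]
% and the direct-sum question asks when this bound can be beaten, i.e. when
\[
  \mathrm{cost}\bigl(T^{\oplus k}\bigr) \;=\; o\bigl(k \cdot \mathrm{cost}(T)\bigr).
\]
```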

Cite this Paper


BibTeX
@InProceedings{pmlr-v247-hanneke24c,
  title     = {Open problem: Direct Sums in Learning Theory},
  author    = {Hanneke, Steve and Moran, Shay and Waknine, Tom},
  booktitle = {Proceedings of Thirty Seventh Conference on Learning Theory},
  pages     = {5325--5329},
  year      = {2024},
  editor    = {Agrawal, Shipra and Roth, Aaron},
  volume    = {247},
  series    = {Proceedings of Machine Learning Research},
  month     = {30 Jun--03 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v247/hanneke24c/hanneke24c.pdf},
  url       = {https://proceedings.mlr.press/v247/hanneke24c.html},
  abstract  = {In computer science, the term "direct sum" refers to fundamental questions about the scaling of computational or information complexity with respect to multiple task instances. Consider an algorithmic task \(T\) and a computational resource \(C\). For instance, \(T\) might be the task of computing a polynomial, with \(C\) representing the number of arithmetic operations required, or \(T\) could be a learning task with its sample complexity as \(C\). The direct sum inquiry focuses on the cost of solving \(k\) separate instances of \(T\), particularly how this aggregate cost compares to the resources needed for a single instance. Typically, the cost for multiple instances is at most \(k\) times the cost of one, since each can be handled independently. However, there are intriguing scenarios where the total cost for \(k\) instances is less than this linear relationship. Such questions naturally extend to the machine-learning setting, in which one may be interested in solving several learning problems at once. This notion of direct sums of learning problems gives rise to various natural questions and interesting problems.}
}
Endnote
%0 Conference Paper
%T Open problem: Direct Sums in Learning Theory
%A Steve Hanneke
%A Shay Moran
%A Tom Waknine
%B Proceedings of Thirty Seventh Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2024
%E Shipra Agrawal
%E Aaron Roth
%F pmlr-v247-hanneke24c
%I PMLR
%P 5325--5329
%U https://proceedings.mlr.press/v247/hanneke24c.html
%V 247
%X In computer science, the term "direct sum" refers to fundamental questions about the scaling of computational or information complexity with respect to multiple task instances. Consider an algorithmic task \(T\) and a computational resource \(C\). For instance, \(T\) might be the task of computing a polynomial, with \(C\) representing the number of arithmetic operations required, or \(T\) could be a learning task with its sample complexity as \(C\). The direct sum inquiry focuses on the cost of solving \(k\) separate instances of \(T\), particularly how this aggregate cost compares to the resources needed for a single instance. Typically, the cost for multiple instances is at most \(k\) times the cost of one, since each can be handled independently. However, there are intriguing scenarios where the total cost for \(k\) instances is less than this linear relationship. Such questions naturally extend to the machine-learning setting, in which one may be interested in solving several learning problems at once. This notion of direct sums of learning problems gives rise to various natural questions and interesting problems.
APA
Hanneke, S., Moran, S. & Waknine, T. (2024). Open problem: Direct Sums in Learning Theory. Proceedings of Thirty Seventh Conference on Learning Theory, in Proceedings of Machine Learning Research 247:5325-5329. Available from https://proceedings.mlr.press/v247/hanneke24c.html.