Open problem: Direct Sums in Learning Theory
Proceedings of Thirty Seventh Conference on Learning Theory, PMLR 247:5325-5329, 2024.
Abstract
In computer science, the term ‘direct sum’ refers to fundamental questions about the scaling of computational or information complexity with respect to multiple task instances. Consider an algorithmic task \(T\) and a computational resource \(C\). For instance, \(T\) might be the task of computing a polynomial, with \(C\) representing the number of arithmetic operations required, or \(T\) could be a learning task with its sample complexity as \(C\). The direct sum inquiry focuses on the cost of solving \(k\) separate instances of \(T\), particularly how this aggregate cost compares to the resources needed for a single instance. Typically, the cost for multiple instances is at most \(k\) times the cost of one, since each can be handled independently. However, there are intriguing scenarios where the total cost for \(k\) instances is strictly smaller than this linear bound. Such questions naturally extend to the machine-learning setting, in which one may be interested in solving several learning problems at once. This notion of direct sums of learning problems gives rise to various natural questions and interesting problems.
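To make the scaling question concrete, here is one way to phrase it formally; the notation \(T^{\oplus k}\) below is introduced here purely for illustration and is not taken from the paper. Write \(T^{\oplus k}\) for the task of solving \(k\) independent instances of \(T\), and \(C(T)\) for the cost of \(T\) under the resource in question (e.g., sample complexity). Handling each instance separately yields the trivial upper bound \(C(T^{\oplus k}) \le k \cdot C(T)\). The direct sum question asks for which tasks this bound is tight, and for which tasks amortization helps, i.e., when \(C(T^{\oplus k}) = o(k \cdot C(T))\) as \(k \to \infty\).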