2024-06-30-hanneke24c.md

File metadata and controls

52 lines (52 loc) · 2.04 KB
---
title: "Open problem: Direct Sums in Learning Theory"
section: Open Problems
abstract: "In computer science, the term 'direct sum' refers to fundamental questions about the scaling of computational or information complexity with respect to multiple task instances. Consider an algorithmic task $T$ and a computational resource $C$. For instance, $T$ might be the task of computing a polynomial, with $C$ representing the number of arithmetic operations required, or $T$ could be a learning task with its sample complexity as $C$. The direct sum inquiry focuses on the cost of solving $k$ separate instances of $T$, particularly how this aggregate cost compares to the resources needed for a single instance. Typically, the cost for multiple instances is at most $k$ times the cost of one, since each can be handled independently. However, there are intriguing scenarios where the total cost for $k$ instances is less than this linear relationship. Such questions naturally extend to the machine-learning setting, in which one may be interested in solving several learning problems at once. This notion of direct sums of learning problems gives rise to various natural questions and interesting problems."
layout: inproceedings
series: Proceedings of Machine Learning Research
publisher: PMLR
issn: 2640-3498
id: hanneke24c
month: 0
tex_title: "Open problem: Direct Sums in Learning Theory"
firstpage: 5325
lastpage: 5329
page: 5325-5329
order: 5325
cycles: false
bibtex_author: Hanneke, Steve and Moran, Shay and Waknine, Tom
author:
- given: Steve
  family: Hanneke
- given: Shay
  family: Moran
- given: Tom
  family: Waknine
date: 2024-06-30
address:
container-title: Proceedings of Thirty Seventh Conference on Learning Theory
volume: '247'
genre: inproceedings
issued:
  date-parts:
  - 2024
  - 6
  - 30
pdf:
extras:
---
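
A minimal formalization of the direct-sum question described in the abstract; the $k$-fold task notation $T^{\oplus k}$ and the cost function $C(\cdot)$ are illustrative choices introduced here, not notation taken from the paper. Writing $C(T)$ for the resource cost of solving one instance of $T$ (e.g., the sample complexity of a learning task) and $T^{\oplus k}$ for the task of solving $k$ independent instances, the trivial bound obtained by handling each instance separately is

$$
C\big(T^{\oplus k}\big) \;\le\; k \cdot C(T),
$$

and the direct-sum question asks when this bound is loose, i.e., when $C\big(T^{\oplus k}\big) = o\big(k \cdot C(T)\big)$ is achievable.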