Function Encoders: A Principled Approach to Transfer Learning in Hilbert Spaces

Tyler Ingebrand, Adam Thorpe, Ufuk Topcu
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:26464-26489, 2025.

Abstract

A central challenge in transfer learning is designing algorithms that can quickly adapt and generalize to new tasks without retraining. Yet, when and how algorithms can effectively transfer to new tasks remains poorly characterized. We introduce a geometric characterization of transfer in Hilbert spaces and define three types of inductive transfer: interpolation within the convex hull, extrapolation to the linear span, and extrapolation outside the span. We propose a method grounded in the theory of function encoders to achieve all three types of transfer. Specifically, we introduce a novel training scheme for function encoders using least-squares optimization, prove a universal approximation theorem for function encoders, and provide a comprehensive comparison with existing approaches such as transformers and meta-learning on four diverse benchmarks. Our experiments demonstrate that the function encoder outperforms state-of-the-art methods across all four benchmarks and all three types of transfer.
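For readers unfamiliar with the method, the least-squares scheme mentioned in the abstract amounts to representing each task's function as a linear combination of learned basis functions, with coefficients obtained by projecting that task's example data onto the span of the basis. The sketch below illustrates only this coefficient step on a toy problem; it is our own illustration, not the authors' implementation. The fixed random-feature basis stands in for the paper's learned neural basis functions, and all names and the toy task are ours.

    # Minimal sketch (our own, hypothetical): project a new task's function
    # onto the span of a fixed basis via least squares. In the paper the
    # basis functions are learned neural networks; here we substitute fixed
    # random sinusoidal features purely for illustration.
    import numpy as np

    rng = np.random.default_rng(0)

    # Stand-in for k learned basis functions g_1, ..., g_k.
    k = 8
    freqs = rng.normal(scale=3.0, size=k)
    phases = rng.uniform(0.0, 2.0 * np.pi, size=k)

    def basis(x):
        # Returns an (n, k) matrix G with G[j, i] = g_i(x_j).
        return np.sin(np.outer(x, freqs) + phases)

    # Example data (x_j, y_j) drawn from an unseen task f.
    xs = rng.uniform(-1.0, 1.0, size=50)
    ys = np.cos(3.0 * xs)  # the new task

    # Least-squares coefficients c minimizing ||G c - y||^2: the new task
    # is projected onto the span of the basis, with no retraining.
    G = basis(xs)
    c, *_ = np.linalg.lstsq(G, ys, rcond=None)

    # Prediction at new inputs: f(x) ~ sum_i c_i g_i(x).
    x_test = np.linspace(-1.0, 1.0, 5)
    print(basis(x_test) @ c)

In the paper's geometric terms, a task handled this way lies in (or near) the linear span of the basis; tasks whose coefficients form a convex combination of training-task coefficients correspond to interpolation within the convex hull.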

Cite this Paper

BibTeX
@InProceedings{pmlr-v267-ingebrand25a,
  title     = {Function Encoders: A Principled Approach to Transfer Learning in {H}ilbert Spaces},
  author    = {Ingebrand, Tyler and Thorpe, Adam and Topcu, Ufuk},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  pages     = {26464--26489},
  year      = {2025},
  editor    = {Singh, Aarti and Fazel, Maryam and Hsu, Daniel and Lacoste-Julien, Simon and Berkenkamp, Felix and Maharaj, Tegan and Wagstaff, Kiri and Zhu, Jerry},
  volume    = {267},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--19 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v267/main/assets/ingebrand25a/ingebrand25a.pdf},
  url       = {https://proceedings.mlr.press/v267/ingebrand25a.html},
  abstract  = {A central challenge in transfer learning is designing algorithms that can quickly adapt and generalize to new tasks without retraining. Yet, when and how algorithms can effectively transfer to new tasks remains poorly characterized. We introduce a geometric characterization of transfer in Hilbert spaces and define three types of inductive transfer: interpolation within the convex hull, extrapolation to the linear span, and extrapolation outside the span. We propose a method grounded in the theory of function encoders to achieve all three types of transfer. Specifically, we introduce a novel training scheme for function encoders using least-squares optimization, prove a universal approximation theorem for function encoders, and provide a comprehensive comparison with existing approaches such as transformers and meta-learning on four diverse benchmarks. Our experiments demonstrate that the function encoder outperforms state-of-the-art methods across all four benchmarks and all three types of transfer.}
}
Endnote
%0 Conference Paper
%T Function Encoders: A Principled Approach to Transfer Learning in Hilbert Spaces
%A Tyler Ingebrand
%A Adam Thorpe
%A Ufuk Topcu
%B Proceedings of the 42nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Aarti Singh
%E Maryam Fazel
%E Daniel Hsu
%E Simon Lacoste-Julien
%E Felix Berkenkamp
%E Tegan Maharaj
%E Kiri Wagstaff
%E Jerry Zhu
%F pmlr-v267-ingebrand25a
%I PMLR
%P 26464--26489
%U https://proceedings.mlr.press/v267/ingebrand25a.html
%V 267
%X A central challenge in transfer learning is designing algorithms that can quickly adapt and generalize to new tasks without retraining. Yet, when and how algorithms can effectively transfer to new tasks remains poorly characterized. We introduce a geometric characterization of transfer in Hilbert spaces and define three types of inductive transfer: interpolation within the convex hull, extrapolation to the linear span, and extrapolation outside the span. We propose a method grounded in the theory of function encoders to achieve all three types of transfer. Specifically, we introduce a novel training scheme for function encoders using least-squares optimization, prove a universal approximation theorem for function encoders, and provide a comprehensive comparison with existing approaches such as transformers and meta-learning on four diverse benchmarks. Our experiments demonstrate that the function encoder outperforms state-of-the-art methods across all four benchmarks and all three types of transfer.
APA
Ingebrand, T., Thorpe, A. & Topcu, U. (2025). Function Encoders: A Principled Approach to Transfer Learning in Hilbert Spaces. Proceedings of the 42nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 267:26464-26489. Available from https://proceedings.mlr.press/v267/ingebrand25a.html.