Task Shift: From Classification to Regression in Overparameterized Linear Models

Tyler LaBonte, Kuo-Wei Lai, Vidya Muthukumar
Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, PMLR 258:3934-3942, 2025.

Abstract

Modern machine learning methods have recently demonstrated remarkable capability to generalize under task shift, where latent knowledge is transferred to a different, often more difficult, task under a similar data distribution. We investigate this phenomenon in an overparameterized linear regression setting where the task shifts from classification during training to regression during evaluation. In the zero-shot case, wherein no regression data is available, we prove that task shift is impossible in both sparse signal and random signal models for any Gaussian covariate distribution. In the few-shot case, wherein limited regression data is available, we propose a simple postprocessing algorithm which asymptotically recovers the ground-truth predictor. Our analysis leverages a fine-grained characterization of individual parameters arising from minimum-norm interpolation which may be of independent interest. Our results show that while minimum-norm interpolators for classification cannot transfer to regression a priori, they experience surprisingly structured attenuation which enables successful task shift with limited additional data.
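The following is a minimal, self-contained sketch (not code from the paper) of the task-shift setup the abstract describes: an overparameterized linear model is fit by minimum-norm interpolation on classification (sign) labels and then evaluated on the regression task defined by the same ground-truth linear predictor. The dimensions, the sparse signal model, and the noiseless responses below are illustrative assumptions, and the paper's few-shot postprocessing algorithm is not reproduced here.

```python
# Illustrative sketch of classification-to-regression task shift with a
# minimum-norm interpolator; all problem sizes are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(0)

n, d = 100, 2000              # n training samples, d >> n parameters (overparameterized)
theta_star = np.zeros(d)      # sparse ground-truth signal (illustrative choice)
theta_star[:5] = 1.0

X = rng.standard_normal((n, d))   # isotropic Gaussian covariates
y_reg = X @ theta_star            # regression responses (noiseless for simplicity)
y_cls = np.sign(y_reg)            # binary labels observed during training

# Minimum-norm interpolator of the sign labels: the least-norm solution of X theta = y_cls.
theta_hat = np.linalg.pinv(X) @ y_cls

# Zero-shot task shift: evaluate the classification-trained interpolator on regression.
X_test = rng.standard_normal((500, d))
zero_shot_mse = np.mean((X_test @ theta_hat - X_test @ theta_star) ** 2)
print(f"Zero-shot regression MSE of the sign-trained interpolator: {zero_shot_mse:.3f}")
```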

Cite this Paper


BibTeX
@InProceedings{pmlr-v258-labonte25a,
  title     = {Task Shift: From Classification to Regression in Overparameterized Linear Models},
  author    = {LaBonte, Tyler and Lai, Kuo-Wei and Muthukumar, Vidya},
  booktitle = {Proceedings of The 28th International Conference on Artificial Intelligence and Statistics},
  pages     = {3934--3942},
  year      = {2025},
  editor    = {Li, Yingzhen and Mandt, Stephan and Agrawal, Shipra and Khan, Emtiyaz},
  volume    = {258},
  series    = {Proceedings of Machine Learning Research},
  month     = {03--05 May},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v258/main/assets/labonte25a/labonte25a.pdf},
  url       = {https://proceedings.mlr.press/v258/labonte25a.html}
}
Endnote
%0 Conference Paper
%T Task Shift: From Classification to Regression in Overparameterized Linear Models
%A Tyler LaBonte
%A Kuo-Wei Lai
%A Vidya Muthukumar
%B Proceedings of The 28th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2025
%E Yingzhen Li
%E Stephan Mandt
%E Shipra Agrawal
%E Emtiyaz Khan
%F pmlr-v258-labonte25a
%I PMLR
%P 3934--3942
%U https://proceedings.mlr.press/v258/labonte25a.html
%V 258
APA
LaBonte, T., Lai, K.-W. & Muthukumar, V. (2025). Task Shift: From Classification to Regression in Overparameterized Linear Models. Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 258:3934-3942. Available from https://proceedings.mlr.press/v258/labonte25a.html.