One-Pass Feature Evolvable Learning with Theoretical Guarantees

Cun-Yuan Xing, Meng-Zhang Qian, Wu-Yang Chen, Wei Gao, Zhi-Hua Zhou
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:68928-68952, 2025.

Abstract

Feature evolvable learning studies the scenario where old features vanish and new features emerge as a data stream evolves; various methods have been developed that exploit relationships between old and new features rather than re-training from scratch. In this work, we focus on two fundamental problems: how to characterize the relationships between two different feature spaces, and how to exploit those relationships for feature evolvable learning. We introduce the Kernel Ortho-Mapping (KOM) discrepancy, which characterizes the relationship between two different feature spaces via kernel functions and correlates it with the optimal classifiers learned from the respective feature spaces. Based on this discrepancy, we develop a one-pass algorithm for feature evolvable learning, which goes through each instance only once without storing the training data, in whole or in part. Our basic idea is to perform online kernel learning with random Fourier features and incorporate feature and label relationships via the KOM discrepancy. We finally validate the effectiveness of our proposed method both theoretically and empirically.
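The "basic idea" in the abstract — online kernel learning with random Fourier features, seeing each instance exactly once — can be sketched as follows. This is an illustrative reconstruction only, not the paper's actual algorithm (which additionally incorporates the KOM discrepancy to transfer information across feature spaces); all function and parameter names here are our own.

```python
import numpy as np

def rff_map(X, W, b):
    """Random Fourier features approximating the RBF kernel
    k(x, y) = exp(-gamma * ||x - y||^2)."""
    D = W.shape[1]
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)

def one_pass_rff_train(stream, d, D=2000, gamma=0.5, lr=0.1, seed=0):
    """One pass of online SGD on the hinge loss in RFF space.
    Each (x, y) pair, with y in {-1, +1}, is processed once and never stored."""
    rng = np.random.default_rng(seed)
    # Spectral samples and random phases defining the RFF map.
    W = rng.normal(scale=np.sqrt(2 * gamma), size=(d, D))
    b = rng.uniform(0.0, 2 * np.pi, size=D)
    w = np.zeros(D)  # linear model in the randomized feature space
    for x, y in stream:
        z = rff_map(x[None, :], W, b)[0]
        if y * (w @ z) < 1.0:  # hinge-loss subgradient step
            w += lr * y * z
    return w, W, b
```

Because the model is linear in the D-dimensional randomized feature space, each update costs O(D) time and memory regardless of how many instances have been seen, which is what makes a one-pass guarantee possible; the inner products `rff_map(X, W, b) @ rff_map(Y, W, b).T` approximate the exact RBF kernel matrix.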

Cite this Paper
BibTeX
@InProceedings{pmlr-v267-xing25c,
  title     = {One-Pass Feature Evolvable Learning with Theoretical Guarantees},
  author    = {Xing, Cun-Yuan and Qian, Meng-Zhang and Chen, Wu-Yang and Gao, Wei and Zhou, Zhi-Hua},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  pages     = {68928--68952},
  year      = {2025},
  editor    = {Singh, Aarti and Fazel, Maryam and Hsu, Daniel and Lacoste-Julien, Simon and Berkenkamp, Felix and Maharaj, Tegan and Wagstaff, Kiri and Zhu, Jerry},
  volume    = {267},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--19 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v267/main/assets/xing25c/xing25c.pdf},
  url       = {https://proceedings.mlr.press/v267/xing25c.html},
  abstract  = {Feature evolvable learning studies the scenario where old features vanish and new features emerge as a data stream evolves; various methods have been developed that exploit relationships between old and new features rather than re-training from scratch. In this work, we focus on two fundamental problems: how to characterize the relationships between two different feature spaces, and how to exploit those relationships for feature evolvable learning. We introduce the Kernel Ortho-Mapping (KOM) discrepancy, which characterizes the relationship between two different feature spaces via kernel functions and correlates it with the optimal classifiers learned from the respective feature spaces. Based on this discrepancy, we develop a one-pass algorithm for feature evolvable learning, which goes through each instance only once without storing the training data, in whole or in part. Our basic idea is to perform online kernel learning with random Fourier features and incorporate feature and label relationships via the KOM discrepancy. We finally validate the effectiveness of our proposed method both theoretically and empirically.}
}
Endnote
%0 Conference Paper
%T One-Pass Feature Evolvable Learning with Theoretical Guarantees
%A Cun-Yuan Xing
%A Meng-Zhang Qian
%A Wu-Yang Chen
%A Wei Gao
%A Zhi-Hua Zhou
%B Proceedings of the 42nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Aarti Singh
%E Maryam Fazel
%E Daniel Hsu
%E Simon Lacoste-Julien
%E Felix Berkenkamp
%E Tegan Maharaj
%E Kiri Wagstaff
%E Jerry Zhu
%F pmlr-v267-xing25c
%I PMLR
%P 68928--68952
%U https://proceedings.mlr.press/v267/xing25c.html
%V 267
%X Feature evolvable learning studies the scenario where old features vanish and new features emerge as a data stream evolves; various methods have been developed that exploit relationships between old and new features rather than re-training from scratch. In this work, we focus on two fundamental problems: how to characterize the relationships between two different feature spaces, and how to exploit those relationships for feature evolvable learning. We introduce the Kernel Ortho-Mapping (KOM) discrepancy, which characterizes the relationship between two different feature spaces via kernel functions and correlates it with the optimal classifiers learned from the respective feature spaces. Based on this discrepancy, we develop a one-pass algorithm for feature evolvable learning, which goes through each instance only once without storing the training data, in whole or in part. Our basic idea is to perform online kernel learning with random Fourier features and incorporate feature and label relationships via the KOM discrepancy. We finally validate the effectiveness of our proposed method both theoretically and empirically.
APA
Xing, C., Qian, M., Chen, W., Gao, W. & Zhou, Z. (2025). One-Pass Feature Evolvable Learning with Theoretical Guarantees. Proceedings of the 42nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 267:68928-68952. Available from https://proceedings.mlr.press/v267/xing25c.html.
