AdaXpert: Adapting Neural Architecture for Growing Data

Shuaicheng Niu, Jiaxiang Wu, Guanghui Xu, Yifan Zhang, Yong Guo, Peilin Zhao, Peng Wang, Mingkui Tan
Proceedings of the 38th International Conference on Machine Learning, PMLR 139:8184-8194, 2021.

Abstract

In real-world applications, data often arrive in a growing manner, where the data volume and the number of classes may increase dynamically. This poses a critical challenge for learning: given the increasing data volume or number of classes, one has to promptly adjust the capacity of the neural model to obtain promising performance. Existing methods either ignore the growing nature of data or seek to independently search an optimal architecture for a given dataset, and are thus incapable of promptly adjusting the architecture when the data change. To address this, we present a neural architecture adaptation method, namely Adaptation eXpert (AdaXpert), to efficiently adjust previous architectures for the growing data. Specifically, we introduce an architecture adjuster to generate a suitable architecture for each data snapshot, based on the previous architecture and the extent of the difference between the current and previous data distributions. Furthermore, we propose an adaptation condition to determine the necessity of adjustment, thereby avoiding unnecessary and time-consuming adjustments. Extensive experiments on two growth scenarios (increasing data volume and increasing number of classes) demonstrate the effectiveness of the proposed method.
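
To make the adaptation loop concrete, below is a minimal Python sketch of the high-level procedure the abstract describes: for each data snapshot, measure how far its distribution is from the previous one, and only invoke the adjuster when the adaptation condition is met. All names here (distribution_distance, adjuster, threshold, etc.) are hypothetical illustrations, not the authors' actual API; in particular, the paper's learned adjuster and its specific distribution-distance measure are not reproduced here.

    # Hypothetical sketch of an AdaXpert-style adaptation loop.
    # Names and the distance measure are illustrative, not the paper's API.
    from typing import Callable, List


    def adapt_over_snapshots(
        snapshots: List[object],
        initial_arch: object,
        adjuster: Callable[[object, float], object],   # (prev arch, distance) -> new arch
        distance_fn: Callable[[object, object], float],  # distribution difference
        train_fn: Callable[[object, object], object],  # trains an arch on a snapshot
        threshold: float = 0.1,  # adaptation condition: adjust only if data changed enough
    ) -> List[object]:
        """Sequentially adapt an architecture as the data grows.

        For each new snapshot, compute the distribution difference from the
        previous snapshot. If it exceeds the threshold (the "adaptation
        condition"), generate a new architecture conditioned on the previous
        one; otherwise reuse the previous architecture and skip the costly
        adjustment.
        """
        arch = initial_arch
        prev_data = None
        trained_models = []
        for data in snapshots:
            if prev_data is not None:
                diff = distance_fn(prev_data, data)
                if diff > threshold:  # data changed enough to warrant adjustment
                    arch = adjuster(arch, diff)  # new arch from prev arch + difference
            trained_models.append(train_fn(arch, data))
            prev_data = data
        return trained_models

The gating step is what distinguishes this from per-snapshot architecture search: when consecutive snapshots are similar, the previous architecture is reused as-is, so no search cost is paid.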

Cite this Paper


BibTeX
@InProceedings{pmlr-v139-niu21a,
  title     = {AdaXpert: Adapting Neural Architecture for Growing Data},
  author    = {Niu, Shuaicheng and Wu, Jiaxiang and Xu, Guanghui and Zhang, Yifan and Guo, Yong and Zhao, Peilin and Wang, Peng and Tan, Mingkui},
  booktitle = {Proceedings of the 38th International Conference on Machine Learning},
  pages     = {8184--8194},
  year      = {2021},
  editor    = {Meila, Marina and Zhang, Tong},
  volume    = {139},
  series    = {Proceedings of Machine Learning Research},
  month     = {18--24 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v139/niu21a/niu21a.pdf},
  url       = {https://proceedings.mlr.press/v139/niu21a.html}
}
Endnote
%0 Conference Paper
%T AdaXpert: Adapting Neural Architecture for Growing Data
%A Shuaicheng Niu
%A Jiaxiang Wu
%A Guanghui Xu
%A Yifan Zhang
%A Yong Guo
%A Peilin Zhao
%A Peng Wang
%A Mingkui Tan
%B Proceedings of the 38th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2021
%E Marina Meila
%E Tong Zhang
%F pmlr-v139-niu21a
%I PMLR
%P 8184--8194
%U https://proceedings.mlr.press/v139/niu21a.html
%V 139
APA
Niu, S., Wu, J., Xu, G., Zhang, Y., Guo, Y., Zhao, P., Wang, P. & Tan, M. (2021). AdaXpert: Adapting Neural Architecture for Growing Data. Proceedings of the 38th International Conference on Machine Learning, in Proceedings of Machine Learning Research 139:8184-8194. Available from https://proceedings.mlr.press/v139/niu21a.html.