TabLog: Test-Time Adaptation for Tabular Data Using Logic Rules

Weijieying Ren, Xiaoting Li, Huiyuan Chen, Vineeth Rakesh, Zhuoyi Wang, Mahashweta Das, Vasant G Honavar
Proceedings of the 41st International Conference on Machine Learning, PMLR 235:42417-42427, 2024.

Abstract

We consider the problem of test-time adaptation of predictive models trained on tabular data. An effective solution requires adapting a predictive model trained on a source domain to a target domain using only unlabeled target domain data, without access to the source domain data. Existing test-time adaptation methods for tabular data have difficulty coping with the heterogeneous features, and the complex dependencies among them, that are inherent in tabular data. To overcome these limitations, we consider test-time adaptation in a setting wherein the logical structure of the rules is assumed to remain invariant under the distribution shift between the source and target domains, whereas the numerical parameters associated with the rules and the weights assigned to them can vary to accommodate the shift. TabLog discretizes numerical features, models dependencies between heterogeneous features, introduces a novel contrastive loss for coping with distribution shift, and provides an end-to-end framework for efficient training and test-time adaptation by exploiting a logical neural network representation of a rule ensemble. We report experiments on several benchmark data sets which demonstrate that TabLog is competitive with, or improves upon, state-of-the-art methods for test-time adaptation of predictive models trained on tabular data. Our code is available at https://github.com/WeijieyingRen/TabLog.
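
To make the setting concrete, the sketch below is a minimal, illustrative example and not the authors' implementation: a rule-ensemble classifier whose logical structure (which discretized feature tests each rule combines) is frozen, while the feature thresholds and per-rule weights are the only parameters updated on unlabeled target data. The soft-conjunction rule scoring, the random rule structure, and the entropy-minimization objective used here in place of TabLog's contrastive loss are assumptions made purely for illustration; see the linked repository for the actual architecture and losses.

import torch
import torch.nn as nn


class SoftRuleEnsemble(nn.Module):
    """Toy rule ensemble: each rule is a soft conjunction of tests x_j > t_{r,j}."""

    def __init__(self, n_features, n_rules, n_classes):
        super().__init__()
        # Fixed logical structure: a binary mask of which features each rule tests.
        self.register_buffer("structure", (torch.rand(n_rules, n_features) < 0.3).float())
        # Numeric parameters that are allowed to adapt at test time.
        self.thresholds = nn.Parameter(torch.randn(n_rules, n_features))
        self.rule_weights = nn.Parameter(0.1 * torch.randn(n_rules, n_classes))

    def forward(self, x):
        # Soft truth value of each literal "x_j > t_{r,j}", shape (batch, rules, features).
        literals = torch.sigmoid(x.unsqueeze(1) - self.thresholds)
        # Soft AND over only the features each rule actually uses.
        fired = torch.exp((self.structure * torch.log(literals + 1e-6)).sum(dim=-1))
        return fired @ self.rule_weights  # class logits


def adapt_at_test_time(model, target_x, steps=10, lr=1e-2):
    """Source-free adaptation: update only thresholds and rule weights on unlabeled data."""
    opt = torch.optim.Adam([model.thresholds, model.rule_weights], lr=lr)
    for _ in range(steps):
        probs = model(target_x).softmax(dim=-1)
        # Entropy minimization as a stand-in for the paper's contrastive objective.
        entropy = -(probs * probs.clamp_min(1e-8).log()).sum(dim=-1).mean()
        opt.zero_grad()
        entropy.backward()
        opt.step()


if __name__ == "__main__":
    torch.manual_seed(0)
    model = SoftRuleEnsemble(n_features=8, n_rules=16, n_classes=2)
    target_batch = torch.randn(32, 8)  # stands in for an unlabeled target-domain batch
    adapt_at_test_time(model, target_batch)
    print(model(target_batch).softmax(dim=-1)[:3])

Running the script adapts only the thresholds and rule weights on a random batch that stands in for unlabeled target data; in TabLog itself, the rule structure learned on the source domain and the contrastive loss take the place of these placeholders.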

Cite this Paper

BibTeX
@InProceedings{pmlr-v235-ren24b,
  title     = {{T}ab{L}og: Test-Time Adaptation for Tabular Data Using Logic Rules},
  author    = {Ren, Weijieying and Li, Xiaoting and Chen, Huiyuan and Rakesh, Vineeth and Wang, Zhuoyi and Das, Mahashweta and Honavar, Vasant G},
  booktitle = {Proceedings of the 41st International Conference on Machine Learning},
  pages     = {42417--42427},
  year      = {2024},
  editor    = {Salakhutdinov, Ruslan and Kolter, Zico and Heller, Katherine and Weller, Adrian and Oliver, Nuria and Scarlett, Jonathan and Berkenkamp, Felix},
  volume    = {235},
  series    = {Proceedings of Machine Learning Research},
  month     = {21--27 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v235/main/assets/ren24b/ren24b.pdf},
  url       = {https://proceedings.mlr.press/v235/ren24b.html}
}
Endnote
%0 Conference Paper
%T TabLog: Test-Time Adaptation for Tabular Data Using Logic Rules
%A Weijieying Ren
%A Xiaoting Li
%A Huiyuan Chen
%A Vineeth Rakesh
%A Zhuoyi Wang
%A Mahashweta Das
%A Vasant G Honavar
%B Proceedings of the 41st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2024
%E Ruslan Salakhutdinov
%E Zico Kolter
%E Katherine Heller
%E Adrian Weller
%E Nuria Oliver
%E Jonathan Scarlett
%E Felix Berkenkamp
%F pmlr-v235-ren24b
%I PMLR
%P 42417--42427
%U https://proceedings.mlr.press/v235/ren24b.html
%V 235
APA
Ren, W., Li, X., Chen, H., Rakesh, V., Wang, Z., Das, M. & Honavar, V. G. (2024). TabLog: Test-Time Adaptation for Tabular Data Using Logic Rules. Proceedings of the 41st International Conference on Machine Learning, in Proceedings of Machine Learning Research 235:42417-42427. Available from https://proceedings.mlr.press/v235/ren24b.html.