Information Bottleneck-guided MLPs for Robust Spatial-temporal Forecasting

Min Chen, Guansong Pang, Wenjun Wang, Cheng Yan
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:8821-8855, 2025.

Abstract

Spatial-temporal forecasting (STF) plays a pivotal role in urban planning and computing. Spatial-Temporal Graph Neural Networks (STGNNs) excel at modeling spatial-temporal dynamics, thus being robust against noise perturbations. However, they often suffer from relatively poor computational efficiency. Simplifying the architectures can improve efficiency but also weakens robustness with respect to noise interference. In this study, we investigate the problem: can simple neural networks such as Multi-Layer Perceptrons (MLPs) achieve robust spatial-temporal forecasting while remaining efficient? To this end, we first reveal the dual noise effect in spatial-temporal data and propose a theoretically grounded principle termed Robust Spatial-Temporal Information Bottleneck (RSTIB), which holds strong potential for improving model robustness. We then design an implementation named RSTIB-MLP, together with a new training regime incorporating a knowledge distillation module, to enhance the robustness of MLPs for STF while maintaining their efficiency. Comprehensive experiments demonstrate that RSTIB-MLP achieves an excellent trade-off between robustness and efficiency, outperforming state-of-the-art STGNNs and MLP-based models. Our code is publicly available at: https://github.com/mchen644/RSTIB.

Cite this Paper


BibTeX
@InProceedings{pmlr-v267-chen25ax,
  title     = {Information Bottleneck-guided {MLP}s for Robust Spatial-temporal Forecasting},
  author    = {Chen, Min and Pang, Guansong and Wang, Wenjun and Yan, Cheng},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  pages     = {8821--8855},
  year      = {2025},
  editor    = {Singh, Aarti and Fazel, Maryam and Hsu, Daniel and Lacoste-Julien, Simon and Berkenkamp, Felix and Maharaj, Tegan and Wagstaff, Kiri and Zhu, Jerry},
  volume    = {267},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--19 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v267/main/assets/chen25ax/chen25ax.pdf},
  url       = {https://proceedings.mlr.press/v267/chen25ax.html},
  abstract  = {Spatial-temporal forecasting (STF) plays a pivotal role in urban planning and computing. Spatial-Temporal Graph Neural Networks (STGNNs) excel at modeling spatial-temporal dynamics, thus being robust against noise perturbations. However, they often suffer from relatively poor computational efficiency. Simplifying the architectures can improve efficiency but also weakens robustness with respect to noise interference. In this study, we investigate the problem: can simple neural networks such as Multi-Layer Perceptrons (MLPs) achieve robust spatial-temporal forecasting while remaining efficient? To this end, we first reveal the dual noise effect in spatial-temporal data and propose a theoretically grounded principle termed Robust Spatial-Temporal Information Bottleneck (RSTIB), which holds strong potential for improving model robustness. We then design an implementation named RSTIB-MLP, together with a new training regime incorporating a knowledge distillation module, to enhance the robustness of MLPs for STF while maintaining their efficiency. Comprehensive experiments demonstrate that RSTIB-MLP achieves an excellent trade-off between robustness and efficiency, outperforming state-of-the-art STGNNs and MLP-based models. Our code is publicly available at: https://github.com/mchen644/RSTIB.}
}
Endnote
%0 Conference Paper
%T Information Bottleneck-guided MLPs for Robust Spatial-temporal Forecasting
%A Min Chen
%A Guansong Pang
%A Wenjun Wang
%A Cheng Yan
%B Proceedings of the 42nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Aarti Singh
%E Maryam Fazel
%E Daniel Hsu
%E Simon Lacoste-Julien
%E Felix Berkenkamp
%E Tegan Maharaj
%E Kiri Wagstaff
%E Jerry Zhu
%F pmlr-v267-chen25ax
%I PMLR
%P 8821--8855
%U https://proceedings.mlr.press/v267/chen25ax.html
%V 267
%X Spatial-temporal forecasting (STF) plays a pivotal role in urban planning and computing. Spatial-Temporal Graph Neural Networks (STGNNs) excel at modeling spatial-temporal dynamics, thus being robust against noise perturbations. However, they often suffer from relatively poor computational efficiency. Simplifying the architectures can improve efficiency but also weakens robustness with respect to noise interference. In this study, we investigate the problem: can simple neural networks such as Multi-Layer Perceptrons (MLPs) achieve robust spatial-temporal forecasting while remaining efficient? To this end, we first reveal the dual noise effect in spatial-temporal data and propose a theoretically grounded principle termed Robust Spatial-Temporal Information Bottleneck (RSTIB), which holds strong potential for improving model robustness. We then design an implementation named RSTIB-MLP, together with a new training regime incorporating a knowledge distillation module, to enhance the robustness of MLPs for STF while maintaining their efficiency. Comprehensive experiments demonstrate that RSTIB-MLP achieves an excellent trade-off between robustness and efficiency, outperforming state-of-the-art STGNNs and MLP-based models. Our code is publicly available at: https://github.com/mchen644/RSTIB.
APA
Chen, M., Pang, G., Wang, W. & Yan, C. (2025). Information Bottleneck-guided MLPs for Robust Spatial-temporal Forecasting. Proceedings of the 42nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 267:8821-8855. Available from https://proceedings.mlr.press/v267/chen25ax.html.