Random Oscillators Network for Time Series Processing

Andrea Ceni, Andrea Cossu, Maximilian W Stölzle, Jingyue Liu, Cosimo Della Santina, Davide Bacciu, Claudio Gallicchio
Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, PMLR 238:4807-4815, 2024.

Abstract

We introduce the Random Oscillators Network (RON), a physically-inspired recurrent model derived from a network of heterogeneous oscillators. Unlike traditional recurrent neural networks, RON keeps the connections between oscillators untrained by leveraging smart random initialisations, leading to exceptional computational efficiency. A rigorous theoretical analysis establishes the necessary and sufficient conditions for the stability of RON, highlighting the natural tendency of RON to lie at the edge of stability, a regime of configurations that yields particularly powerful and expressive models. Through an extensive empirical evaluation on several benchmarks, we show four main advantages of RON. 1) RON shows excellent long-term memory and sequence classification ability, outperforming other randomised approaches. 2) RON outperforms fully-trained recurrent models and state-of-the-art randomised models in chaotic time series forecasting. 3) RON provides expressive internal representations even in a small-parametrisation regime, making it amenable to deployment on low-powered devices and at the edge. 4) RON is up to two orders of magnitude faster than fully-trained models.
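For intuition, the sketch below illustrates the reservoir-computing flavour of this setup: a discretised network of coupled oscillators with fixed, randomly initialised connections and heterogeneous per-oscillator parameters, whose states feed a trainable linear readout. The update rule, the parameter names (W, V, gamma, eps) and the ridge-regression readout are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical RON-style reservoir (assumption: second-order coupled-oscillator
# dynamics with fixed random weights; the paper's exact equations may differ).
n_in, n_osc, dt = 1, 100, 0.05
W = rng.uniform(-1, 1, (n_osc, n_osc)) / np.sqrt(n_osc)  # fixed random coupling
V = rng.uniform(-1, 1, (n_osc, n_in))                    # fixed random input weights
b = rng.uniform(-1, 1, n_osc)
gamma = rng.uniform(0.5, 2.0, n_osc)  # heterogeneous stiffness per oscillator
eps = rng.uniform(0.5, 2.0, n_osc)    # heterogeneous damping per oscillator

def run_reservoir(u_seq):
    """Drive the untrained oscillator network with an input sequence and
    collect its hidden states; only the readout is trained afterwards."""
    y = np.zeros(n_osc)  # oscillator positions
    z = np.zeros(n_osc)  # oscillator velocities
    states = []
    for u in u_seq:
        # explicit Euler step of the second-order dynamics
        z = z + dt * (np.tanh(W @ y + V @ u + b) - gamma * y - eps * z)
        y = y + dt * z
        states.append(y.copy())
    return np.asarray(states)

# Toy next-step forecasting task: fit a ridge-regression readout in closed form.
t = np.linspace(0, 60, 1200)
u = np.sin(t)[:, None]
X = run_reservoir(u[:-1])   # reservoir states
Y = u[1:, 0]                # next-step targets
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_osc), X.T @ Y)
print("train MSE:", np.mean((X @ W_out - Y) ** 2))
```

Because only the readout W_out is fit (here in closed form), no gradients are propagated through the recurrent part, which is consistent with the computational-efficiency claims in the abstract.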

Cite this Paper


BibTeX
@InProceedings{pmlr-v238-ceni24a,
  title     = {Random Oscillators Network for Time Series Processing},
  author    = {Ceni, Andrea and Cossu, Andrea and W St\"{o}lzle, Maximilian and Liu, Jingyue and Della Santina, Cosimo and Bacciu, Davide and Gallicchio, Claudio},
  booktitle = {Proceedings of The 27th International Conference on Artificial Intelligence and Statistics},
  pages     = {4807--4815},
  year      = {2024},
  editor    = {Dasgupta, Sanjoy and Mandt, Stephan and Li, Yingzhen},
  volume    = {238},
  series    = {Proceedings of Machine Learning Research},
  month     = {02--04 May},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v238/ceni24a/ceni24a.pdf},
  url       = {https://proceedings.mlr.press/v238/ceni24a.html},
  abstract  = {We introduce the Random Oscillators Network (RON), a physically-inspired recurrent model derived from a network of heterogeneous oscillators. Unlike traditional recurrent neural networks, RON keeps the connections between oscillators untrained by leveraging on smart random initialisations, leading to exceptional computational efficiency. A rigorous theoretical analysis finds the necessary and sufficient conditions for the stability of RON, highlighting the natural tendency of RON to lie at the edge of stability, a regime of configurations offering particularly powerful and expressive models. Through an extensive empirical evaluation on several benchmarks, we show four main advantages of RON. 1) RON shows excellent long-term memory and sequence classification ability, outperforming other randomised approaches. 2) RON outperforms fully-trained recurrent models and state-of-the-art randomised models in chaotic time series forecasting. 3) RON provides expressive internal representations even in a small parametrisation regime making it amenable to be deployed on low-powered devices and at the edge. 4) RON is up to two orders of magnitude faster than fully-trained models.}
}
Endnote
%0 Conference Paper
%T Random Oscillators Network for Time Series Processing
%A Andrea Ceni
%A Andrea Cossu
%A Maximilian W Stölzle
%A Jingyue Liu
%A Cosimo Della Santina
%A Davide Bacciu
%A Claudio Gallicchio
%B Proceedings of The 27th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2024
%E Sanjoy Dasgupta
%E Stephan Mandt
%E Yingzhen Li
%F pmlr-v238-ceni24a
%I PMLR
%P 4807--4815
%U https://proceedings.mlr.press/v238/ceni24a.html
%V 238
%X We introduce the Random Oscillators Network (RON), a physically-inspired recurrent model derived from a network of heterogeneous oscillators. Unlike traditional recurrent neural networks, RON keeps the connections between oscillators untrained by leveraging on smart random initialisations, leading to exceptional computational efficiency. A rigorous theoretical analysis finds the necessary and sufficient conditions for the stability of RON, highlighting the natural tendency of RON to lie at the edge of stability, a regime of configurations offering particularly powerful and expressive models. Through an extensive empirical evaluation on several benchmarks, we show four main advantages of RON. 1) RON shows excellent long-term memory and sequence classification ability, outperforming other randomised approaches. 2) RON outperforms fully-trained recurrent models and state-of-the-art randomised models in chaotic time series forecasting. 3) RON provides expressive internal representations even in a small parametrisation regime making it amenable to be deployed on low-powered devices and at the edge. 4) RON is up to two orders of magnitude faster than fully-trained models.
APA
Ceni, A., Cossu, A., W Stölzle, M., Liu, J., Della Santina, C., Bacciu, D. & Gallicchio, C. (2024). Random Oscillators Network for Time Series Processing. Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 238:4807-4815. Available from https://proceedings.mlr.press/v238/ceni24a.html.