Autoregressive Convolutional Neural Networks for Asynchronous Time Series

Mikolaj Binkowski, Gautier Marti, Philippe Donnat
Proceedings of the 35th International Conference on Machine Learning, PMLR 80:580-589, 2018.

Abstract

We propose the Significance-Offset Convolutional Neural Network, a deep convolutional network architecture for regression of multivariate asynchronous time series. The model is inspired by standard autoregressive (AR) models and gating mechanisms used in recurrent neural networks. It involves an AR-like weighting system, where the final predictor is obtained as a weighted sum of adjusted regressors, while the weights are data-dependent functions learnt through a convolutional network. The architecture was designed for applications on asynchronous time series and is evaluated on the following datasets: a hedge fund proprietary dataset of over 2 million quotes for a credit derivative index, an artificially generated noisy autoregressive series, and the UCI household electricity consumption dataset. The proposed architecture achieves promising results compared to convolutional and recurrent neural networks.
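The core combination step described above can be sketched in a few lines: the prediction is a weighted sum of offset-adjusted past observations, with data-dependent weights normalized over the lookback window. This is a minimal NumPy illustration, not the paper's implementation; in the actual model the significance weights and offsets are produced by convolutional sub-networks, whereas here they are assumed to be precomputed inputs.

```python
import numpy as np

def socnn_predict(x, significance_logits, offsets):
    """Significance-Offset combination (sketch).
    x: (T,) past observations in the lookback window.
    significance_logits, offsets: (T,) hypothetical outputs of the
    significance and offset sub-networks (precomputed here)."""
    # Softmax-normalize the data-dependent significance weights.
    w = np.exp(significance_logits - significance_logits.max())
    w /= w.sum()
    # Weighted sum of offset-adjusted regressors.
    return float(np.sum(w * (x + offsets)))
```

With uniform logits and zero offsets the predictor reduces to the sample mean of the window, which is a quick sanity check on the weighting scheme.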

Cite this Paper


BibTeX
@InProceedings{pmlr-v80-binkowski18a,
  title     = {Autoregressive Convolutional Neural Networks for Asynchronous Time Series},
  author    = {Binkowski, Mikolaj and Marti, Gautier and Donnat, Philippe},
  booktitle = {Proceedings of the 35th International Conference on Machine Learning},
  pages     = {580--589},
  year      = {2018},
  editor    = {Dy, Jennifer and Krause, Andreas},
  volume    = {80},
  series    = {Proceedings of Machine Learning Research},
  month     = {10--15 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v80/binkowski18a/binkowski18a.pdf},
  url       = {https://proceedings.mlr.press/v80/binkowski18a.html},
  abstract  = {We propose Significance-Offset Convolutional Neural Network, a deep convolutional network architecture for regression of multivariate asynchronous time series. The model is inspired by standard autoregressive (AR) models and gating mechanisms used in recurrent neural networks. It involves an AR-like weighting system, where the final predictor is obtained as a weighted sum of adjusted regressors, while the weights are data-dependent functions learnt through a convolutional network. The architecture was designed for applications on asynchronous time series and is evaluated on such datasets: a hedge fund proprietary dataset of over 2 million quotes for a credit derivative index, an artificially generated noisy autoregressive series and UCI household electricity consumption dataset. The proposed architecture achieves promising results as compared to convolutional and recurrent neural networks.}
}
Endnote
%0 Conference Paper
%T Autoregressive Convolutional Neural Networks for Asynchronous Time Series
%A Mikolaj Binkowski
%A Gautier Marti
%A Philippe Donnat
%B Proceedings of the 35th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2018
%E Jennifer Dy
%E Andreas Krause
%F pmlr-v80-binkowski18a
%I PMLR
%P 580--589
%U https://proceedings.mlr.press/v80/binkowski18a.html
%V 80
%X We propose Significance-Offset Convolutional Neural Network, a deep convolutional network architecture for regression of multivariate asynchronous time series. The model is inspired by standard autoregressive (AR) models and gating mechanisms used in recurrent neural networks. It involves an AR-like weighting system, where the final predictor is obtained as a weighted sum of adjusted regressors, while the weights are data-dependent functions learnt through a convolutional network. The architecture was designed for applications on asynchronous time series and is evaluated on such datasets: a hedge fund proprietary dataset of over 2 million quotes for a credit derivative index, an artificially generated noisy autoregressive series and UCI household electricity consumption dataset. The proposed architecture achieves promising results as compared to convolutional and recurrent neural networks.
APA
Binkowski, M., Marti, G., & Donnat, P. (2018). Autoregressive Convolutional Neural Networks for Asynchronous Time Series. Proceedings of the 35th International Conference on Machine Learning, in Proceedings of Machine Learning Research 80:580-589. Available from https://proceedings.mlr.press/v80/binkowski18a.html.