Bit-Swap: Recursive Bits-Back Coding for Lossless Compression with Hierarchical Latent Variables

Friso Kingma, Pieter Abbeel, Jonathan Ho
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:3408-3417, 2019.

Abstract

The bits-back argument suggests that latent variable models can be turned into lossless compression schemes. Translating the bits-back argument into efficient and practical lossless compression schemes for general latent variable models, however, is still an open problem. Bits-Back with Asymmetric Numeral Systems (BB-ANS), recently proposed by Townsend et al., 2019, makes bits-back coding practically feasible for latent variable models with one latent layer, but it is inefficient for hierarchical latent variable models. In this paper we propose Bit-Swap, a new compression scheme that generalizes BB-ANS and achieves strictly better compression rates for hierarchical latent variable models with Markov chain structure. Through experiments we verify that Bit-Swap results in lossless compression rates that are empirically superior to existing techniques.
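To make the difference in operation order concrete, below is a minimal codelength-accounting sketch, not the paper's implementation (which uses an actual ANS entropy coder). The `ToyCoder` class, its `push`/`pop` methods, and the probability values are all hypothetical illustrations: popping a symbol of probability p from the bitstream consumes -log2(p) bits, pushing one adds -log2(p) bits. The sketch contrasts BB-ANS and Bit-Swap on a two-layer Markov hierarchy x ← z1 ← z2.

```python
import math

class ToyCoder:
    """Toy stand-in for an ANS bitstream: tracks only the stack length in
    (fractional) bits. pop(p) models decoding a symbol of probability p,
    which consumes -log2(p) bits; push(p) models encoding one, which adds
    -log2(p) bits. Codelength accounting only, not a real entropy coder."""
    def __init__(self, initial_bits=0.0):
        self.bits = initial_bits
        self.min_bits = initial_bits  # lowest the stack ever gets

    def push(self, p):
        self.bits += -math.log2(p)

    def pop(self, p):
        self.bits -= -math.log2(p)
        self.min_bits = min(self.min_bits, self.bits)

# Illustrative (made-up) probabilities for one datapoint x:
q_z1, q_z2 = 0.25, 0.25          # inference:  q(z1|x), q(z2|z1)
p_x, p_z1, p_z2 = 0.1, 0.2, 0.5  # generative: p(x|z1), p(z1|z2), p(z2)

# BB-ANS treats the hierarchy as one joint latent: decode ALL latents
# before encoding anything, so initial bits must cover both pops.
bbans = ToyCoder(initial_bits=10.0)
bbans.pop(q_z1)
bbans.pop(q_z2)
bbans.push(p_x)
bbans.push(p_z1)
bbans.push(p_z2)

# Bit-Swap interleaves decoding and encoding: the bits pushed when
# encoding x are immediately reusable for decoding z2.
bitswap = ToyCoder(initial_bits=10.0)
bitswap.pop(q_z1)
bitswap.push(p_x)
bitswap.pop(q_z2)
bitswap.push(p_z1)
bitswap.push(p_z2)

print(bbans.bits, bbans.min_bits)      # 12.64..., 6.0
print(bitswap.bits, bitswap.min_bits)  # 12.64..., 8.0
```

Both schemes end with the same net codelength (the negative evidence lower bound in bits), but Bit-Swap's stack never dips as low, so it requires fewer auxiliary initial bits; this reduced initial-bits overhead is the source of its improved compression rates on hierarchical models.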

Cite this Paper


BibTeX
@InProceedings{pmlr-v97-kingma19a,
  title     = {Bit-Swap: Recursive Bits-Back Coding for Lossless Compression with Hierarchical Latent Variables},
  author    = {Kingma, Friso and Abbeel, Pieter and Ho, Jonathan},
  booktitle = {Proceedings of the 36th International Conference on Machine Learning},
  pages     = {3408--3417},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Salakhutdinov, Ruslan},
  volume    = {97},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--15 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v97/kingma19a/kingma19a.pdf},
  url       = {https://proceedings.mlr.press/v97/kingma19a.html},
  abstract  = {The bits-back argument suggests that latent variable models can be turned into lossless compression schemes. Translating the bits-back argument into efficient and practical lossless compression schemes for general latent variable models, however, is still an open problem. Bits-Back with Asymmetric Numeral Systems (BB-ANS), recently proposed by Townsend et al., 2019, makes bits-back coding practically feasible for latent variable models with one latent layer, but it is inefficient for hierarchical latent variable models. In this paper we propose Bit-Swap, a new compression scheme that generalizes BB-ANS and achieves strictly better compression rates for hierarchical latent variable models with Markov chain structure. Through experiments we verify that Bit-Swap results in lossless compression rates that are empirically superior to existing techniques.}
}
Endnote
%0 Conference Paper
%T Bit-Swap: Recursive Bits-Back Coding for Lossless Compression with Hierarchical Latent Variables
%A Friso Kingma
%A Pieter Abbeel
%A Jonathan Ho
%B Proceedings of the 36th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Ruslan Salakhutdinov
%F pmlr-v97-kingma19a
%I PMLR
%P 3408--3417
%U https://proceedings.mlr.press/v97/kingma19a.html
%V 97
%X The bits-back argument suggests that latent variable models can be turned into lossless compression schemes. Translating the bits-back argument into efficient and practical lossless compression schemes for general latent variable models, however, is still an open problem. Bits-Back with Asymmetric Numeral Systems (BB-ANS), recently proposed by Townsend et al., 2019, makes bits-back coding practically feasible for latent variable models with one latent layer, but it is inefficient for hierarchical latent variable models. In this paper we propose Bit-Swap, a new compression scheme that generalizes BB-ANS and achieves strictly better compression rates for hierarchical latent variable models with Markov chain structure. Through experiments we verify that Bit-Swap results in lossless compression rates that are empirically superior to existing techniques.
APA
Kingma, F., Abbeel, P. & Ho, J. (2019). Bit-Swap: Recursive Bits-Back Coding for Lossless Compression with Hierarchical Latent Variables. Proceedings of the 36th International Conference on Machine Learning, in Proceedings of Machine Learning Research 97:3408-3417. Available from https://proceedings.mlr.press/v97/kingma19a.html.