Improving Statistical Fidelity for Neural Image Compression with Implicit Local Likelihood Models

Matthew J. Muckley, Alaaeldin El-Nouby, Karen Ullrich, Herve Jegou, Jakob Verbeek
Proceedings of the 40th International Conference on Machine Learning, PMLR 202:25426-25443, 2023.

Abstract

Lossy image compression aims to represent images in as few bits as possible while maintaining fidelity to the original. Theoretical results indicate that optimizing distortion metrics such as PSNR or MS-SSIM necessarily leads to a discrepancy between the statistics of original images and those of reconstructions, in particular at low bitrates, often manifested as blurring of the compressed images. Previous work has leveraged adversarial discriminators to improve statistical fidelity. Yet these binary discriminators, adopted from generative modeling tasks, may not be ideal for image compression. In this paper, we introduce a non-binary discriminator that is conditioned on quantized local image representations obtained via VQ-VAE autoencoders. Our evaluations on the CLIC2020, DIV2K and Kodak datasets show that our discriminator is more effective for jointly optimizing distortion (e.g., PSNR) and statistical fidelity (e.g., FID) than the PatchGAN of the state-of-the-art HiFiC model. On CLIC2020, we obtain the same FID as HiFiC with 30-40% fewer bits.
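To make the abstract's idea concrete, the sketch below shows one way a non-binary discriminator conditioned on quantized local image representations could be wired up in PyTorch. This is a simplified illustration under stated assumptions, not the authors' implementation: the codebook size, network widths, the random stand-in code indices, and the use of an extra "reconstruction" class as the discriminator target for fakes are all illustrative choices. In a real setup, the per-location code indices would come from a pretrained VQ-VAE encoder applied to the original image.

import torch
import torch.nn as nn
import torch.nn.functional as F

class LocalCodeDiscriminator(nn.Module):
    # Per-location classifier over K VQ-VAE codebook indices plus one reserved
    # "reconstruction" class (index K). This replaces the single real/fake bit
    # of a binary PatchGAN-style discriminator with a non-binary target.
    def __init__(self, in_channels=3, num_codes=1024, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_channels, hidden, 4, stride=2, padding=1),     # H   -> H/2
            nn.LeakyReLU(0.2),
            nn.Conv2d(hidden, 2 * hidden, 4, stride=2, padding=1),      # H/2 -> H/4
            nn.LeakyReLU(0.2),
            nn.Conv2d(2 * hidden, 4 * hidden, 4, stride=2, padding=1),  # H/4 -> H/8
            nn.LeakyReLU(0.2),
            nn.Conv2d(4 * hidden, num_codes + 1, 1),                    # per-cell logits
        )

    def forward(self, img):
        return self.net(img)  # (B, num_codes + 1, H/8, W/8)

def d_loss(disc, real, recon, codes, num_codes):
    # Discriminator objective: map each cell of a real image to its true local
    # code index, and each cell of a reconstruction to the reserved class.
    fake_target = torch.full_like(codes, num_codes)
    return (F.cross_entropy(disc(real), codes)
            + F.cross_entropy(disc(recon.detach()), fake_target))

def g_loss(disc, recon, codes):
    # Generator (compression model) objective: make reconstructions look like
    # they carry the original image's local codes.
    return F.cross_entropy(disc(recon), codes)

# Toy usage with random tensors; in practice `codes` would be produced by a
# pretrained VQ-VAE encoder/quantizer run on the original image.
B, K = 2, 1024
real = torch.rand(B, 3, 64, 64)
recon = torch.rand(B, 3, 64, 64, requires_grad=True)
codes = torch.randint(0, K, (B, 8, 8))   # 64 px / (2*2*2 downsampling) = 8 cells per side
disc = LocalCodeDiscriminator(num_codes=K)
print(d_loss(disc, real, recon, codes, K).item(), g_loss(disc, recon, codes).item())

The key difference from a binary PatchGAN discriminator is that the per-location target is a codebook index rather than a real/fake bit, which is one way to read the abstract's "non-binary discriminator conditioned on quantized local image representations obtained via VQ-VAE autoencoders".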

Cite this Paper


BibTeX
@InProceedings{pmlr-v202-muckley23a,
  title     = {Improving Statistical Fidelity for Neural Image Compression with Implicit Local Likelihood Models},
  author    = {Muckley, Matthew J. and El-Nouby, Alaaeldin and Ullrich, Karen and Jegou, Herve and Verbeek, Jakob},
  booktitle = {Proceedings of the 40th International Conference on Machine Learning},
  pages     = {25426--25443},
  year      = {2023},
  editor    = {Krause, Andreas and Brunskill, Emma and Cho, Kyunghyun and Engelhardt, Barbara and Sabato, Sivan and Scarlett, Jonathan},
  volume    = {202},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--29 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v202/muckley23a/muckley23a.pdf},
  url       = {https://proceedings.mlr.press/v202/muckley23a.html}
}
Endnote
%0 Conference Paper
%T Improving Statistical Fidelity for Neural Image Compression with Implicit Local Likelihood Models
%A Matthew J. Muckley
%A Alaaeldin El-Nouby
%A Karen Ullrich
%A Herve Jegou
%A Jakob Verbeek
%B Proceedings of the 40th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2023
%E Andreas Krause
%E Emma Brunskill
%E Kyunghyun Cho
%E Barbara Engelhardt
%E Sivan Sabato
%E Jonathan Scarlett
%F pmlr-v202-muckley23a
%I PMLR
%P 25426--25443
%U https://proceedings.mlr.press/v202/muckley23a.html
%V 202
APA
Muckley, M.J., El-Nouby, A., Ullrich, K., Jegou, H. & Verbeek, J. (2023). Improving Statistical Fidelity for Neural Image Compression with Implicit Local Likelihood Models. Proceedings of the 40th International Conference on Machine Learning, in Proceedings of Machine Learning Research 202:25426-25443. Available from https://proceedings.mlr.press/v202/muckley23a.html.
