LotteryCodec: Searching the Implicit Representation in a Random Network for Low-Complexity Image Compression

Haotian Wu, Gongpu Chen, Pier Luigi Dragotti, Deniz Gunduz
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:67147-67171, 2025.

Abstract

We introduce and validate the lottery codec hypothesis, which states that untrained subnetworks within randomly initialized networks can serve as synthesis networks for overfitted image compression, achieving rate-distortion (RD) performance comparable to trained networks. This hypothesis leads to a new paradigm for image compression by encoding image statistics into the network substructure. Building on this hypothesis, we propose LotteryCodec, which overfits a binary mask to an individual image, leveraging an over-parameterized and randomly initialized network shared by the encoder and the decoder. To address over-parameterization challenges and streamline subnetwork search, we develop a rewind modulation mechanism that improves the RD performance. LotteryCodec outperforms VTM and sets a new state-of-the-art in single-image compression. LotteryCodec also enables adaptive decoding complexity through adjustable mask ratios, offering flexible compression solutions for diverse device constraints and application requirements.
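The core idea in the abstract can be illustrated with a toy sketch: a frozen, randomly initialized network is shared by encoder and decoder, and only a binary mask over its units is fitted to one signal. Everything below (the 1-D stand-in for an image, the greedy bit-flip search) is our own simplification for illustration, not the paper's rewind-modulation method or gradient-based mask training.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny 1-D "image": values on a coordinate grid (a stand-in for pixels).
coords = np.linspace(-1.0, 1.0, 32)[:, None]      # (32, 1) coordinate inputs
target = np.sin(3.0 * coords[:, 0])               # (32,) signal to overfit

# Frozen random 2-layer MLP, 1 -> H -> 1. Its weights are never trained;
# only the binary mask over hidden units is searched.
H = 64
W1 = rng.normal(0.0, 1.0, (1, H))
b1 = rng.normal(0.0, 1.0, H)
W2 = rng.normal(0.0, 1.0 / np.sqrt(H), (H, 1))

def synthesize(mask):
    """Run the masked random network; the mask selects hidden units."""
    h = np.maximum(coords @ W1 + b1, 0.0) * mask  # ReLU, then mask
    return (h @ W2)[:, 0]

def mse(mask):
    err = synthesize(mask) - target
    return float(np.mean(err ** 2))

# Greedy bit-flip search for a good subnetwork: a crude stand-in for the
# mask optimization an overfitted codec would perform at the encoder.
mask = np.ones(H)
best = mse(mask)
for _ in range(5):                                # a few sweeps over all bits
    for j in range(H):
        mask[j] = 1.0 - mask[j]                   # try flipping bit j
        trial = mse(mask)
        if trial < best:
            best = trial                          # keep the improving flip
        else:
            mask[j] = 1.0 - mask[j]               # revert

print(mse(np.ones(H)), best)
```

Only the binary mask (H bits here) would need to be transmitted; the decoder regenerates the same random network from a shared seed and applies the mask.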

Cite this Paper


BibTeX
@InProceedings{pmlr-v267-wu25e,
  title     = {{L}ottery{C}odec: Searching the Implicit Representation in a Random Network for Low-Complexity Image Compression},
  author    = {Wu, Haotian and Chen, Gongpu and Dragotti, Pier Luigi and Gunduz, Deniz},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  pages     = {67147--67171},
  year      = {2025},
  editor    = {Singh, Aarti and Fazel, Maryam and Hsu, Daniel and Lacoste-Julien, Simon and Berkenkamp, Felix and Maharaj, Tegan and Wagstaff, Kiri and Zhu, Jerry},
  volume    = {267},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--19 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v267/main/assets/wu25e/wu25e.pdf},
  url       = {https://proceedings.mlr.press/v267/wu25e.html},
  abstract  = {We introduce and validate the lottery codec hypothesis, which states that untrained subnetworks within randomly initialized networks can serve as synthesis networks for overfitted image compression, achieving rate-distortion (RD) performance comparable to trained networks. This hypothesis leads to a new paradigm for image compression by encoding image statistics into the network substructure. Building on this hypothesis, we propose LotteryCodec, which overfits a binary mask to an individual image, leveraging an over-parameterized and randomly initialized network shared by the encoder and the decoder. To address over-parameterization challenges and streamline subnetwork search, we develop a rewind modulation mechanism that improves the RD performance. LotteryCodec outperforms VTM and sets a new state-of-the-art in single-image compression. LotteryCodec also enables adaptive decoding complexity through adjustable mask ratios, offering flexible compression solutions for diverse device constraints and application requirements.}
}
Endnote
%0 Conference Paper
%T LotteryCodec: Searching the Implicit Representation in a Random Network for Low-Complexity Image Compression
%A Haotian Wu
%A Gongpu Chen
%A Pier Luigi Dragotti
%A Deniz Gunduz
%B Proceedings of the 42nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Aarti Singh
%E Maryam Fazel
%E Daniel Hsu
%E Simon Lacoste-Julien
%E Felix Berkenkamp
%E Tegan Maharaj
%E Kiri Wagstaff
%E Jerry Zhu
%F pmlr-v267-wu25e
%I PMLR
%P 67147--67171
%U https://proceedings.mlr.press/v267/wu25e.html
%V 267
%X We introduce and validate the lottery codec hypothesis, which states that untrained subnetworks within randomly initialized networks can serve as synthesis networks for overfitted image compression, achieving rate-distortion (RD) performance comparable to trained networks. This hypothesis leads to a new paradigm for image compression by encoding image statistics into the network substructure. Building on this hypothesis, we propose LotteryCodec, which overfits a binary mask to an individual image, leveraging an over-parameterized and randomly initialized network shared by the encoder and the decoder. To address over-parameterization challenges and streamline subnetwork search, we develop a rewind modulation mechanism that improves the RD performance. LotteryCodec outperforms VTM and sets a new state-of-the-art in single-image compression. LotteryCodec also enables adaptive decoding complexity through adjustable mask ratios, offering flexible compression solutions for diverse device constraints and application requirements.
APA
Wu, H., Chen, G., Dragotti, P.L. & Gunduz, D. (2025). LotteryCodec: Searching the Implicit Representation in a Random Network for Low-Complexity Image Compression. Proceedings of the 42nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 267:67147-67171. Available from https://proceedings.mlr.press/v267/wu25e.html.