Optimal Eye Surgeon: Finding image priors through sparse generators at initialization

Avrajit Ghosh, Xitong Zhang, Kenneth K. Sun, Qing Qu, Saiprasad Ravishankar, Rongrong Wang
Proceedings of the 41st International Conference on Machine Learning, PMLR 235:15610-15645, 2024.

Abstract

We introduce Optimal Eye Surgeon (OES), a framework for pruning and training deep image generator networks. Typically, untrained deep convolutional networks, which include image sampling operations, serve as effective image priors. However, because they are overparameterized, they tend to overfit to noise in image restoration tasks. OES addresses this by adaptively pruning networks at random initialization to a level of underparameterization. This process effectively captures low-frequency image components by masking alone, even without training. When trained to fit a noisy image, these pruned subnetworks, which we term Sparse-DIP, resist overfitting to noise. This benefit arises from underparameterization and the regularization effect of masking, which constrain them to the manifold of image priors. We demonstrate that subnetworks pruned through OES surpass other leading pruning methods, such as the Lottery Ticket Hypothesis, which is known to be suboptimal for image recovery tasks. Our extensive experiments demonstrate the transferability of OES masks and the characteristics of sparse subnetworks for image generation. Code is available at https://github.com/Avra98/Optimal-Eye-Surgeon.
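
To make the idea concrete, below is a minimal PyTorch sketch of mask-learning at initialization in the spirit of OES, not the authors' implementation (see the linked repository for that). It freezes a toy convolutional generator at its random initialization, learns only a per-weight mask logit so that the masked network fits the target image, and then binarizes the mask at a chosen sparsity. The tiny architecture, the plain sigmoid relaxation, the top-k binarization, the 5% sparsity level, and all hyperparameters here are placeholder assumptions.

# Sketch only: learn a supermask over a frozen, randomly initialized
# generator (requires PyTorch >= 2.0 for torch.func.functional_call).
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy convolutional decoder standing in for the DIP generator (assumption).
net = nn.Sequential(
    nn.Conv2d(8, 32, 3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 3, 3, padding=1),
)
for p in net.parameters():
    p.requires_grad_(False)          # weights stay at random initialization

z = torch.randn(1, 8, 64, 64)        # fixed input code
y = torch.rand(1, 3, 64, 64)         # noisy target image (placeholder)

# One trainable logit per weight; sigmoid(logit) is a soft keep-probability.
logits = [torch.zeros_like(p, requires_grad=True) for p in net.parameters()]
opt = torch.optim.Adam(logits, lr=0.1)

def masked_forward(x):
    # Multiply each frozen weight by its soft mask and run a functional
    # forward pass, so gradients flow only into the mask logits.
    params = {name: p * torch.sigmoid(l)
              for (name, p), l in zip(net.named_parameters(), logits)}
    return torch.func.functional_call(net, params, (x,))

for step in range(200):               # optimize the mask, not the weights
    opt.zero_grad()
    loss = ((masked_forward(z) - y) ** 2).mean()
    loss.backward()
    opt.step()

# Binarize: keep the top 5% of mask scores globally (sparsity is a knob;
# a simplification of the paper's procedure).
flat = torch.cat([l.flatten() for l in logits])
thresh = flat.topk(max(1, int(0.05 * flat.numel()))).values[-1]
masks = [(l >= thresh).float() for l in logits]
print("kept fraction:", (sum(m.sum() for m in masks) / flat.numel()).item())

The binary masks would then define the Sparse-DIP subnetwork, whose surviving weights are trained on the noisy image in a second stage.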

Cite this Paper


BibTeX
@InProceedings{pmlr-v235-ghosh24c,
  title     = {Optimal Eye Surgeon: Finding image priors through sparse generators at initialization},
  author    = {Ghosh, Avrajit and Zhang, Xitong and Sun, Kenneth K. and Qu, Qing and Ravishankar, Saiprasad and Wang, Rongrong},
  booktitle = {Proceedings of the 41st International Conference on Machine Learning},
  pages     = {15610--15645},
  year      = {2024},
  editor    = {Salakhutdinov, Ruslan and Kolter, Zico and Heller, Katherine and Weller, Adrian and Oliver, Nuria and Scarlett, Jonathan and Berkenkamp, Felix},
  volume    = {235},
  series    = {Proceedings of Machine Learning Research},
  month     = {21--27 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v235/main/assets/ghosh24c/ghosh24c.pdf},
  url       = {https://proceedings.mlr.press/v235/ghosh24c.html}
}
APA
Ghosh, A., Zhang, X., Sun, K. K., Qu, Q., Ravishankar, S. & Wang, R. (2024). Optimal Eye Surgeon: Finding image priors through sparse generators at initialization. Proceedings of the 41st International Conference on Machine Learning, in Proceedings of Machine Learning Research 235:15610-15645. Available from https://proceedings.mlr.press/v235/ghosh24c.html.
