Compressive sensing with un-trained neural networks: Gradient descent finds a smooth approximation

Reinhard Heckel, Mahdi Soltanolkotabi
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:4149-4158, 2020.

Abstract

Un-trained convolutional neural networks have emerged as highly successful tools for image recovery and restoration. They can solve standard inverse problems such as denoising and compressive sensing with excellent results by simply fitting a neural network model to measurements from a single image or signal, without the need for any additional training data. For some applications, this critically requires additional regularization in the form of early stopping of the optimization. For signal recovery from a few measurements, however, un-trained convolutional networks have an intriguing self-regularizing property: even though the network can perfectly fit any image, it recovers a natural image from few measurements when trained with gradient descent until convergence. In this paper, we provide numerical evidence for this property and study it theoretically. We show that, without any further regularization, an un-trained convolutional neural network can approximately reconstruct signals and images that are sufficiently structured from a near-minimal number of random measurements.
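To make the setup concrete, the sketch below fits a small un-trained convolutional decoder to random Gaussian measurements of a single image and runs the optimization to convergence with no early stopping. Everything here is an illustrative assumption rather than the paper's configuration: the deep-decoder-style architecture, the dimensions, the synthetic test image, and the use of Adam (the paper's analysis concerns plain gradient descent).

import torch
import torch.nn as nn

torch.manual_seed(0)

side = 64
n = side * side      # signal dimension (64x64 image)
m = n // 4           # number of random measurements, m << n

# Synthetic smooth test image x and Gaussian measurement matrix A;
# the measurements y = A x are the only data the network ever sees.
x = torch.linspace(0, 1, n).reshape(1, 1, side, side)
A = torch.randn(m, n) / m ** 0.5
y = A @ x.flatten()

# Un-trained decoder G_w: fixed random input z, trainable weights w.
# Bilinear upsampling + 1x1 convolutions, in the spirit of a deep
# decoder; depth and width here are arbitrary illustrative choices.
k = 32
G = nn.Sequential(
    nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False),
    nn.Conv2d(k, k, kernel_size=1), nn.ReLU(), nn.BatchNorm2d(k),
    nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False),
    nn.Conv2d(k, k, kernel_size=1), nn.ReLU(), nn.BatchNorm2d(k),
    nn.Conv2d(k, 1, kernel_size=1), nn.Sigmoid(),
)
z = torch.randn(1, k, side // 4, side // 4)  # fixed random seed tensor

# Fit the measurements until convergence -- no early stopping --
# relying on the self-regularizing behavior the paper studies.
opt = torch.optim.Adam(G.parameters(), lr=1e-3)
for step in range(5000):
    opt.zero_grad()
    loss = ((A @ G(z).flatten() - y) ** 2).mean()
    loss.backward()
    opt.step()

x_hat = G(z).detach()  # reconstruction of x from y alone
print(f"final measurement loss {loss.item():.2e}")

Note that only the network weights are optimized; the input z stays fixed, and the measurement count m is a fraction of the signal dimension n, which is the compressive-sensing regime the abstract describes.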

Cite this Paper

BibTeX
@InProceedings{pmlr-v119-heckel20a,
  title     = {Compressive sensing with un-trained neural networks: Gradient descent finds a smooth approximation},
  author    = {Heckel, Reinhard and Soltanolkotabi, Mahdi},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {4149--4158},
  year      = {2020},
  editor    = {III, Hal Daumé and Singh, Aarti},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/heckel20a/heckel20a.pdf},
  url       = {https://proceedings.mlr.press/v119/heckel20a.html}
}
Endnote
%0 Conference Paper
%T Compressive sensing with un-trained neural networks: Gradient descent finds a smooth approximation
%A Reinhard Heckel
%A Mahdi Soltanolkotabi
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-heckel20a
%I PMLR
%P 4149--4158
%U https://proceedings.mlr.press/v119/heckel20a.html
%V 119
APA
Heckel, R. & Soltanolkotabi, M. (2020). Compressive sensing with un-trained neural networks: Gradient descent finds a smooth approximation. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:4149-4158. Available from https://proceedings.mlr.press/v119/heckel20a.html.
