AR-DAE: Towards Unbiased Neural Entropy Gradient Estimation

Jae Hyun Lim, Aaron Courville, Christopher Pal, Chin-Wei Huang
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:6061-6071, 2020.

Abstract

Entropy is ubiquitous in machine learning, but it is in general intractable to compute the entropy of the distribution of an arbitrary continuous random variable. In this paper, we propose the amortized residual denoising autoencoder (AR-DAE) to approximate the gradient of the log density function, which can be used to estimate the gradient of entropy. Amortization allows us to significantly reduce the error of the gradient approximator by approaching asymptotic optimality of a regular DAE, in which case the estimation is in theory unbiased. We conduct theoretical and experimental analyses on the approximation error of the proposed method, as well as extensive studies on heuristics to ensure its robustness. Finally, using the proposed gradient approximator to estimate the gradient of entropy, we demonstrate state-of-the-art performance on density estimation with variational autoencoders and continuous control with soft actor-critic.
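The abstract names two ingredients: a residual DAE trained so that its output approaches the score ∇_x log q(x) as the noise scale shrinks, and an entropy-gradient estimate obtained by plugging that learned score into the reparameterized gradient of H(q). Below is a minimal sketch of both pieces, assuming a PyTorch-style approximator f_theta(x, sigma) and reparameterized samples x = g_phi(eps); the function names, signatures, and the Gaussian prior over the noise scale are illustrative simplifications, not the authors' reference implementation.

import torch

def ardae_loss(f_theta, x, delta=1.0):
    # Amortized residual DAE objective (sketch):
    #   E_{x, u, sigma} || u + sigma * f_theta(x + sigma * u, sigma) ||^2
    # with u ~ N(0, I) and sigma drawn from a zero-mean Gaussian prior (scale `delta`).
    # At the optimum, f_theta(x, sigma) -> grad_x log q(x) as sigma -> 0.
    u = torch.randn_like(x)                                    # isotropic Gaussian noise
    sigma = delta * torch.randn(x.size(0), 1, device=x.device)  # amortized noise scale
    residual = u + sigma * f_theta(x + sigma * u, sigma)
    return residual.pow(2).sum(dim=1).mean()

def entropy_surrogate(f_theta, x):
    # Surrogate for the entropy gradient. With reparameterized samples x = g_phi(eps),
    #   grad_phi H(q_phi) = -E[ grad_x log q_phi(x) . dx/dphi ],
    # so replacing the exact score by the (detached) AR-DAE output queried at sigma = 0
    # gives a surrogate whose gradient descent direction increases the entropy.
    sigma0 = torch.zeros(x.size(0), 1, device=x.device)
    score = f_theta(x, sigma0).detach()                         # approx. grad_x log q(x)
    return (score * x).sum(dim=1).mean()

In practice one would alternate between fitting f_theta with ardae_loss on samples from the current q_phi and updating phi with entropy_surrogate added to the task loss; the sigma = 0 query reflects the asymptotic-optimality argument in the abstract, under which the estimator is unbiased in theory.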

Cite this Paper


BibTeX
@InProceedings{pmlr-v119-lim20a,
  title     = {{AR}-{DAE}: Towards Unbiased Neural Entropy Gradient Estimation},
  author    = {Lim, Jae Hyun and Courville, Aaron and Pal, Christopher and Huang, Chin-Wei},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {6061--6071},
  year      = {2020},
  editor    = {III, Hal Daumé and Singh, Aarti},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/lim20a/lim20a.pdf},
  url       = {https://proceedings.mlr.press/v119/lim20a.html}
}
Endnote
%0 Conference Paper
%T AR-DAE: Towards Unbiased Neural Entropy Gradient Estimation
%A Jae Hyun Lim
%A Aaron Courville
%A Christopher Pal
%A Chin-Wei Huang
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-lim20a
%I PMLR
%P 6061--6071
%U https://proceedings.mlr.press/v119/lim20a.html
%V 119
APA
Lim, J. H., Courville, A., Pal, C., & Huang, C. (2020). AR-DAE: Towards Unbiased Neural Entropy Gradient Estimation. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research, 119:6061-6071. Available from https://proceedings.mlr.press/v119/lim20a.html.