Fisher Auto-Encoders

Khalil Elkhalil, Ali Hasan, Jie Ding, Sina Farsiu, Vahid Tarokh
Proceedings of The 24th International Conference on Artificial Intelligence and Statistics, PMLR 130:352-360, 2021.

Abstract

It has been conjectured that the Fisher divergence is more robust to model uncertainty than the conventional Kullback-Leibler (KL) divergence. This motivates the design of a new class of robust generative auto-encoders (AEs) referred to as Fisher auto-encoders. Our approach is to design Fisher AEs by minimizing the Fisher divergence between the intractable joint distribution of observed data and latent variables and that of the postulated/modeled joint distribution. In contrast to KL-based variational AEs (VAEs), the Fisher AE can exactly quantify the distance between the true and the model-based posterior distributions. Qualitative and quantitative results are provided on both the MNIST and CelebA datasets, demonstrating the competitive performance of Fisher AEs in terms of robustness compared to other AEs such as VAEs and Wasserstein AEs.
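The objective described above is the Fisher divergence, F(p‖q) = E_{x∼p}[‖∇_x log p(x) − ∇_x log q(x)‖²], i.e. the expected squared difference of the two score functions. As a minimal illustrative sketch (not the paper's implementation, and using Gaussians chosen purely for tractability), the following compares a Monte Carlo estimate against the closed form for two one-dimensional Gaussians with equal variance:

```python
import numpy as np

rng = np.random.default_rng(0)

def gaussian_score(x, mu, sigma):
    # Score of N(mu, sigma^2): d/dx log p(x) = -(x - mu) / sigma^2
    return -(x - mu) / sigma**2

# "True" distribution p and model distribution q (illustrative parameters)
mu_p, sigma_p = 0.0, 1.0
mu_q, sigma_q = 0.5, 1.0

# Monte Carlo estimate: F(p || q) = E_{x~p}[(score_p(x) - score_q(x))^2]
x = rng.normal(mu_p, sigma_p, size=100_000)
fisher_mc = np.mean(
    (gaussian_score(x, mu_p, sigma_p) - gaussian_score(x, mu_q, sigma_q)) ** 2
)

# Closed form when sigma_p == sigma_q: F = (mu_p - mu_q)^2 / sigma^4
fisher_exact = (mu_p - mu_q) ** 2 / sigma_p**4
```

With equal variances the score difference is constant, so the estimate matches the closed form exactly; in the paper's setting the scores of the intractable joint distributions are of course not available in closed form, which is what makes the construction nontrivial.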

Cite this Paper


BibTeX
@InProceedings{pmlr-v130-elkhalil21a,
  title     = {Fisher Auto-Encoders},
  author    = {Elkhalil, Khalil and Hasan, Ali and Ding, Jie and Farsiu, Sina and Tarokh, Vahid},
  booktitle = {Proceedings of The 24th International Conference on Artificial Intelligence and Statistics},
  pages     = {352--360},
  year      = {2021},
  editor    = {Banerjee, Arindam and Fukumizu, Kenji},
  volume    = {130},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--15 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v130/elkhalil21a/elkhalil21a.pdf},
  url       = {https://proceedings.mlr.press/v130/elkhalil21a.html},
  abstract  = {It has been conjectured that the Fisher divergence is more robust to model uncertainty than the conventional Kullback-Leibler (KL) divergence. This motivates the design of a new class of robust generative auto-encoders (AE) referred to as Fisher auto-encoders. Our approach is to design Fisher AEs by minimizing the Fisher divergence between the intractable joint distribution of observed data and latent variables, with that of the postulated/modeled joint distribution. In contrast to KL-based variational AEs (VAEs), the Fisher AE can exactly quantify the distance between the true and the model-based posterior distributions. Qualitative and quantitative results are provided on both MNIST and celebA datasets demonstrating the competitive performance of Fisher AEs in terms of robustness compared to other AEs such as VAEs and Wasserstein AEs.}
}
Endnote
%0 Conference Paper
%T Fisher Auto-Encoders
%A Khalil Elkhalil
%A Ali Hasan
%A Jie Ding
%A Sina Farsiu
%A Vahid Tarokh
%B Proceedings of The 24th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2021
%E Arindam Banerjee
%E Kenji Fukumizu
%F pmlr-v130-elkhalil21a
%I PMLR
%P 352--360
%U https://proceedings.mlr.press/v130/elkhalil21a.html
%V 130
%X It has been conjectured that the Fisher divergence is more robust to model uncertainty than the conventional Kullback-Leibler (KL) divergence. This motivates the design of a new class of robust generative auto-encoders (AE) referred to as Fisher auto-encoders. Our approach is to design Fisher AEs by minimizing the Fisher divergence between the intractable joint distribution of observed data and latent variables, with that of the postulated/modeled joint distribution. In contrast to KL-based variational AEs (VAEs), the Fisher AE can exactly quantify the distance between the true and the model-based posterior distributions. Qualitative and quantitative results are provided on both MNIST and celebA datasets demonstrating the competitive performance of Fisher AEs in terms of robustness compared to other AEs such as VAEs and Wasserstein AEs.
APA
Elkhalil, K., Hasan, A., Ding, J., Farsiu, S. &amp; Tarokh, V. (2021). Fisher Auto-Encoders. Proceedings of The 24th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 130:352-360. Available from https://proceedings.mlr.press/v130/elkhalil21a.html.