Principal Bit Analysis: Autoencoding with Schur-Concave Loss

Sourbh Bhadane, Aaron B Wagner, Jayadev Acharya
Proceedings of the 38th International Conference on Machine Learning, PMLR 139:852-862, 2021.

Abstract

We consider a linear autoencoder in which the latent variables are quantized, or corrupted by noise, and the constraint is Schur-concave in the set of latent variances. Although finding the optimal encoder/decoder pair for this setup is a nonconvex optimization problem, we show that decomposing the source into its principal components is optimal. If the constraint is strictly Schur-concave and the empirical covariance matrix has only simple eigenvalues, then any optimal encoder/decoder must decompose the source in this way. As one application, we consider a strictly Schur-concave constraint that estimates the number of bits needed to represent the latent variables under fixed-rate encoding, a setup that we call \emph{Principal Bit Analysis (PBA)}. This yields a practical, general-purpose, fixed-rate compressor that outperforms existing algorithms. As a second application, we show that a prototypical autoencoder-based variable-rate compressor is guaranteed to decompose the source into its principal components.
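The sketch below is a rough, illustrative rendering of the fixed-rate pipeline the abstract describes: project the data onto its principal components, then quantize the latent variables under a fixed bit budget. It is not the PBA algorithm from the paper; the log-variance bit allocation and uniform mid-point quantizer are placeholder choices, and every function name here is invented for this example.

# Illustrative sketch only: a PCA-based fixed-rate compressor in the spirit of
# the setup described in the abstract. The bit-allocation rule is a simple
# log-variance heuristic chosen for illustration, not the allocation derived
# in the paper.
import numpy as np

def fit_pca(X):
    """Return the mean, principal directions, and latent variances of the rows of X."""
    mu = X.mean(axis=0)
    cov = np.cov(X - mu, rowvar=False)           # empirical covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)
    order = np.argsort(eigvals)[::-1]            # sort by descending variance
    return mu, eigvecs[:, order], np.maximum(eigvals[order], 0.0)

def compress(X, total_bits=32):
    """Fixed-rate compression: decompose into principal components and
    uniformly quantize each latent with a heuristic bit budget."""
    mu, U, lam = fit_pca(X)
    Z = (X - mu) @ U                             # latent variables
    # Heuristic allocation: bits roughly proportional to log latent variance
    # (flooring means the budget is an upper bound, not met exactly).
    w = np.log1p(lam)
    bits = np.floor(total_bits * w / w.sum()).astype(int)
    Zq = np.empty_like(Z)
    for j, b in enumerate(bits):
        lo, hi = Z[:, j].min(), Z[:, j].max()
        if b == 0 or hi == lo:
            Zq[:, j] = Z[:, j].mean()            # represent this component by a constant
            continue
        levels = 2 ** b
        step = (hi - lo) / levels
        idx = np.clip(np.floor((Z[:, j] - lo) / step), 0, levels - 1)
        Zq[:, j] = lo + (idx + 0.5) * step       # mid-point reconstruction
    return Zq @ U.T + mu                         # decode back to data space

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((1000, 8)) @ rng.standard_normal((8, 8))
    Xhat = compress(X, total_bits=32)
    print("MSE:", np.mean((X - Xhat) ** 2))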

Cite this Paper


BibTeX
@InProceedings{pmlr-v139-bhadane21a,
  title     = {Principal Bit Analysis: Autoencoding with Schur-Concave Loss},
  author    = {Bhadane, Sourbh and Wagner, Aaron B and Acharya, Jayadev},
  booktitle = {Proceedings of the 38th International Conference on Machine Learning},
  pages     = {852--862},
  year      = {2021},
  editor    = {Meila, Marina and Zhang, Tong},
  volume    = {139},
  series    = {Proceedings of Machine Learning Research},
  month     = {18--24 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v139/bhadane21a/bhadane21a.pdf},
  url       = {https://proceedings.mlr.press/v139/bhadane21a.html},
  abstract  = {We consider a linear autoencoder in which the latent variables are quantized, or corrupted by noise, and the constraint is Schur-concave in the set of latent variances. Although finding the optimal encoder/decoder pair for this setup is a nonconvex optimization problem, we show that decomposing the source into its principal components is optimal. If the constraint is strictly Schur-concave and the empirical covariance matrix has only simple eigenvalues, then any optimal encoder/decoder must decompose the source in this way. As one application, we consider a strictly Schur-concave constraint that estimates the number of bits needed to represent the latent variables under fixed-rate encoding, a setup that we call \emph{Principal Bit Analysis (PBA)}. This yields a practical, general-purpose, fixed-rate compressor that outperforms existing algorithms. As a second application, we show that a prototypical autoencoder-based variable-rate compressor is guaranteed to decompose the source into its principal components.}
}
Endnote
%0 Conference Paper
%T Principal Bit Analysis: Autoencoding with Schur-Concave Loss
%A Sourbh Bhadane
%A Aaron B Wagner
%A Jayadev Acharya
%B Proceedings of the 38th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2021
%E Marina Meila
%E Tong Zhang
%F pmlr-v139-bhadane21a
%I PMLR
%P 852--862
%U https://proceedings.mlr.press/v139/bhadane21a.html
%V 139
%X We consider a linear autoencoder in which the latent variables are quantized, or corrupted by noise, and the constraint is Schur-concave in the set of latent variances. Although finding the optimal encoder/decoder pair for this setup is a nonconvex optimization problem, we show that decomposing the source into its principal components is optimal. If the constraint is strictly Schur-concave and the empirical covariance matrix has only simple eigenvalues, then any optimal encoder/decoder must decompose the source in this way. As one application, we consider a strictly Schur-concave constraint that estimates the number of bits needed to represent the latent variables under fixed-rate encoding, a setup that we call \emph{Principal Bit Analysis (PBA)}. This yields a practical, general-purpose, fixed-rate compressor that outperforms existing algorithms. As a second application, we show that a prototypical autoencoder-based variable-rate compressor is guaranteed to decompose the source into its principal components.
APA
Bhadane, S., Wagner, A.B. & Acharya, J. (2021). Principal Bit Analysis: Autoencoding with Schur-Concave Loss. Proceedings of the 38th International Conference on Machine Learning, in Proceedings of Machine Learning Research 139:852-862. Available from https://proceedings.mlr.press/v139/bhadane21a.html.