Steinmetz Neural Networks for Complex-Valued Data

Shyam Venkatasubramanian, Ali Pezeshki, Vahid Tarokh
Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, PMLR 258:3916-3924, 2025.

Abstract

We introduce a new approach to processing complex-valued data using DNNs consisting of parallel real-valued subnetworks with coupled outputs. Our proposed class of architectures, referred to as Steinmetz Neural Networks, incorporates multi-view learning to construct more interpretable representations in the latent space. Moreover, we present the Analytic Neural Network, which incorporates a consistency penalty that encourages analytic signal representations in the latent space of the Steinmetz neural network. This penalty enforces a deterministic and orthogonal relationship between the real and imaginary components. Using an information-theoretic construction, we demonstrate that the generalization gap upper bound posited by the analytic neural network is lower than that of the general class of Steinmetz neural networks. Our numerical experiments depict the improved performance and robustness to additive noise afforded by our proposed networks on benchmark datasets and synthetic examples.
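The architecture described in the abstract can be sketched in a few lines; the following is a minimal NumPy illustration under stated assumptions, not the authors' implementation. The subnetwork sizes, tanh activations, and the complex coupling of the two latent vectors are illustrative choices, and the analytic consistency penalty is realized here as a squared error between the imaginary-path latent and an FFT-based discrete Hilbert transform of the real-path latent.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(x, w1, w2):
    # A toy real-valued subnetwork: one hidden layer with tanh activation.
    return np.tanh(x @ w1) @ w2

def hilbert(x):
    # Discrete Hilbert transform via the FFT: keep the analytic-signal
    # spectrum (zero negative frequencies, double positive ones) and
    # return the imaginary part of the result.
    n = x.shape[-1]
    mask = np.zeros(n)
    mask[0] = 1.0
    if n % 2 == 0:
        mask[n // 2] = 1.0
        mask[1:n // 2] = 2.0
    else:
        mask[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(np.fft.fft(x) * mask).imag

# Toy complex-valued input signal of length 64.
z = rng.standard_normal(64) + 1j * rng.standard_normal(64)

# Two parallel real-valued subnetworks with independent weights: one
# processes the real part of the input, the other the imaginary part.
d_in, d_hidden, d_latent = 64, 32, 64
wr1 = rng.standard_normal((d_in, d_hidden))
wr2 = rng.standard_normal((d_hidden, d_latent))
wi1 = rng.standard_normal((d_in, d_hidden))
wi2 = rng.standard_normal((d_hidden, d_latent))

u = mlp(z.real, wr1, wr2)   # latent from the real-part subnetwork
v = mlp(z.imag, wi1, wi2)   # latent from the imaginary-part subnetwork

# Coupled output: interpret (u, v) as one complex-valued latent vector.
latent = u + 1j * v

# Analytic consistency penalty: drive v toward the Hilbert transform of u,
# so that u + i*v is encouraged to approximate an analytic signal.
penalty = np.mean((v - hilbert(u)) ** 2)
print(latent.shape, penalty >= 0.0)
```

In training, this penalty would be added to the task loss as a regularizer; since the Hilbert transform makes v a deterministic function of u, minimizing it encourages the orthogonal real/imaginary relationship the abstract describes.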

Cite this Paper


BibTeX
@InProceedings{pmlr-v258-venkatasubramanian25a,
  title     = {Steinmetz Neural Networks for Complex-Valued Data},
  author    = {Venkatasubramanian, Shyam and Pezeshki, Ali and Tarokh, Vahid},
  booktitle = {Proceedings of The 28th International Conference on Artificial Intelligence and Statistics},
  pages     = {3916--3924},
  year      = {2025},
  editor    = {Li, Yingzhen and Mandt, Stephan and Agrawal, Shipra and Khan, Emtiyaz},
  volume    = {258},
  series    = {Proceedings of Machine Learning Research},
  month     = {03--05 May},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v258/main/assets/venkatasubramanian25a/venkatasubramanian25a.pdf},
  url       = {https://proceedings.mlr.press/v258/venkatasubramanian25a.html},
  abstract  = {We introduce a new approach to processing complex-valued data using DNNs consisting of parallel real-valued subnetworks with coupled outputs. Our proposed class of architectures, referred to as Steinmetz Neural Networks, incorporates multi-view learning to construct more interpretable representations in the latent space. Moreover, we present the Analytic Neural Network, which incorporates a consistency penalty that encourages analytic signal representations in the latent space of the Steinmetz neural network. This penalty enforces a deterministic and orthogonal relationship between the real and imaginary components. Using an information-theoretic construction, we demonstrate that the generalization gap upper bound posited by the analytic neural network is lower than that of the general class of Steinmetz neural networks. Our numerical experiments depict the improved performance and robustness to additive noise afforded by our proposed networks on benchmark datasets and synthetic examples.}
}
Endnote
%0 Conference Paper
%T Steinmetz Neural Networks for Complex-Valued Data
%A Shyam Venkatasubramanian
%A Ali Pezeshki
%A Vahid Tarokh
%B Proceedings of The 28th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2025
%E Yingzhen Li
%E Stephan Mandt
%E Shipra Agrawal
%E Emtiyaz Khan
%F pmlr-v258-venkatasubramanian25a
%I PMLR
%P 3916--3924
%U https://proceedings.mlr.press/v258/venkatasubramanian25a.html
%V 258
%X We introduce a new approach to processing complex-valued data using DNNs consisting of parallel real-valued subnetworks with coupled outputs. Our proposed class of architectures, referred to as Steinmetz Neural Networks, incorporates multi-view learning to construct more interpretable representations in the latent space. Moreover, we present the Analytic Neural Network, which incorporates a consistency penalty that encourages analytic signal representations in the latent space of the Steinmetz neural network. This penalty enforces a deterministic and orthogonal relationship between the real and imaginary components. Using an information-theoretic construction, we demonstrate that the generalization gap upper bound posited by the analytic neural network is lower than that of the general class of Steinmetz neural networks. Our numerical experiments depict the improved performance and robustness to additive noise afforded by our proposed networks on benchmark datasets and synthetic examples.
APA
Venkatasubramanian, S., Pezeshki, A. & Tarokh, V. (2025). Steinmetz Neural Networks for Complex-Valued Data. Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 258:3916-3924. Available from https://proceedings.mlr.press/v258/venkatasubramanian25a.html.