DeepPolar: Inventing Nonlinear Large-Kernel Polar Codes via Deep Learning

S Ashwin Hebbar, Sravan Kumar Ankireddy, Hyeji Kim, Sewoong Oh, Pramod Viswanath
Proceedings of the 41st International Conference on Machine Learning, PMLR 235:18133-18154, 2024.

Abstract

Progress in designing channel codes has been driven by human ingenuity and, fittingly, has been sporadic. Polar codes, developed on the foundation of Arikan’s polarization kernel, represent the latest breakthrough in coding theory and have emerged as the state-of-the-art error-correction code for short-to-medium block length regimes. In an effort to automate the invention of good channel codes, especially in this regime, we explore a novel, non-linear generalization of Polar codes, which we call DeepPolar codes. DeepPolar codes extend the conventional Polar coding framework by utilizing a larger kernel size and parameterizing these kernels and matched decoders through neural networks. Our results demonstrate that these data-driven codes effectively leverage the benefits of a larger kernel size, resulting in enhanced reliability when compared to both existing neural codes and conventional Polar codes.
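To make the encoding idea in the abstract concrete, the sketch below is a minimal, hypothetical PyTorch rendering of a DeepPolar-style encoder: a learned MLP kernel of size ell replaces Arikan's fixed 2x2 kernel and is applied recursively over a block of length n = ell**depth, with frozen positions set to zero and the output power-normalized. The class names (NeuralKernel, DeepPolarStyleEncoder), the MLP architecture, the per-level kernel sharing, and the choice of information set are illustrative assumptions rather than the authors' implementation; the matched neural decoder, which the paper also parameterizes with neural networks, is omitted here.

```python
import torch
import torch.nn as nn


class NeuralKernel(nn.Module):
    """A small MLP acting as a learned, nonlinear ell-to-ell kernel
    (replacing Arikan's fixed 2x2 kernel). The architecture here is an
    illustrative assumption, not the paper's exact configuration."""

    def __init__(self, ell: int, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(ell, hidden), nn.SiLU(),
            nn.Linear(hidden, hidden), nn.SiLU(),
            nn.Linear(hidden, ell),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)


class DeepPolarStyleEncoder(nn.Module):
    """Recursively applies learned kernels over a length-n block, n = ell**depth,
    following the butterfly structure of the classical polar transform."""

    def __init__(self, ell: int, depth: int):
        super().__init__()
        self.ell, self.depth = ell, depth
        # One kernel per recursion level; per-position kernels are another design option.
        self.kernels = nn.ModuleList([NeuralKernel(ell) for _ in range(depth)])

    def forward(self, u: torch.Tensor) -> torch.Tensor:
        # u: (batch, n), message symbols in information positions, zeros in frozen ones.
        batch, n = u.shape
        x, stride = u, 1
        for kernel in self.kernels:
            # Combine groups of ell coordinates spaced `stride` apart at this level.
            x = x.reshape(batch, n // (self.ell * stride), self.ell, stride)
            x = kernel(x.transpose(2, 3))           # kernel acts on the size-ell axis
            x = x.transpose(2, 3).reshape(batch, n)
            stride *= self.ell
        # Normalize to unit average power before the (e.g. AWGN) channel.
        return x / x.pow(2).mean(dim=1, keepdim=True).sqrt()


# Toy usage: kernel size ell = 4, n = 4**3 = 64, 32 information positions (rate 1/2).
enc = DeepPolarStyleEncoder(ell=4, depth=3)
u = torch.zeros(8, 64)                              # batch of 8 blocks
info_positions = torch.arange(32, 64)               # hypothetical information set
u[:, info_positions] = 2.0 * torch.randint(0, 2, (8, 32)).float() - 1.0
codewords = enc(u)                                  # shape: (8, 64)
```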

Cite this Paper


BibTeX
@InProceedings{pmlr-v235-hebbar24a,
  title     = {{D}eep{P}olar: Inventing Nonlinear Large-Kernel Polar Codes via Deep Learning},
  author    = {Hebbar, S Ashwin and Ankireddy, Sravan Kumar and Kim, Hyeji and Oh, Sewoong and Viswanath, Pramod},
  booktitle = {Proceedings of the 41st International Conference on Machine Learning},
  pages     = {18133--18154},
  year      = {2024},
  editor    = {Salakhutdinov, Ruslan and Kolter, Zico and Heller, Katherine and Weller, Adrian and Oliver, Nuria and Scarlett, Jonathan and Berkenkamp, Felix},
  volume    = {235},
  series    = {Proceedings of Machine Learning Research},
  month     = {21--27 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v235/main/assets/hebbar24a/hebbar24a.pdf},
  url       = {https://proceedings.mlr.press/v235/hebbar24a.html}
}
Endnote
%0 Conference Paper
%T DeepPolar: Inventing Nonlinear Large-Kernel Polar Codes via Deep Learning
%A S Ashwin Hebbar
%A Sravan Kumar Ankireddy
%A Hyeji Kim
%A Sewoong Oh
%A Pramod Viswanath
%B Proceedings of the 41st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2024
%E Ruslan Salakhutdinov
%E Zico Kolter
%E Katherine Heller
%E Adrian Weller
%E Nuria Oliver
%E Jonathan Scarlett
%E Felix Berkenkamp
%F pmlr-v235-hebbar24a
%I PMLR
%P 18133--18154
%U https://proceedings.mlr.press/v235/hebbar24a.html
%V 235
APA
Hebbar, S. A., Ankireddy, S. K., Kim, H., Oh, S., & Viswanath, P. (2024). DeepPolar: Inventing Nonlinear Large-Kernel Polar Codes via Deep Learning. Proceedings of the 41st International Conference on Machine Learning, in Proceedings of Machine Learning Research 235:18133-18154. Available from https://proceedings.mlr.press/v235/hebbar24a.html.
