Learning the Electronic Hamiltonian of Large Atomic Structures

Chen Hao Xia, Manasa Kaniselvan, Alexandros Nikolaos Ziogas, Marko Mladenović, Rayen Mahjoub, Alexander Maeder, Mathieu Luisier
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:68236-68257, 2025.

Abstract

Graph neural networks (GNNs) have shown promise in learning the ground-state electronic properties of materials, circumventing ab initio density functional theory (DFT) calculations when the underlying lattices can be represented as small and/or repeatable unit cells (i.e., molecules and periodic crystals). Realistic systems are, however, non-ideal and generally characterized by higher structural complexity. As such, they require large (10+ Å) unit cells and thousands of atoms to be accurately described. At these scales, DFT becomes computationally prohibitive, making GNNs especially attractive. In this work, we present a strictly local equivariant GNN capable of learning the electronic Hamiltonian (H) of realistically extended materials. It incorporates an augmented partitioning approach that enables training on arbitrarily large structures while preserving local atomic environments beyond partition boundaries. We demonstrate its capabilities by predicting the electronic Hamiltonian of various systems with up to 3,000 nodes (atoms), 500,000+ edges, 28 million orbital interactions (nonzero entries of H), and ≤0.53% error in the eigenvalue spectra. Our work expands the applicability of current electronic property prediction methods to some of the most challenging cases encountered in computational materials science, namely systems with disorder, interfaces, and defects.
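As a concrete illustration of the two ideas the abstract highlights, below is a minimal Python sketch, not the authors' implementation: the helper names (partition_with_halo, spectrum_error), the slab-along-x partitioning scheme, and the cutoff parameter are all illustrative assumptions. It shows (i) augmenting each partition with a "halo" of atoms within one interaction cutoff, so that atoms near a partition boundary keep the same local environment they have in the full structure, and (ii) a relative eigenvalue-spectrum error of the kind used as the accuracy metric.

import numpy as np

def partition_with_halo(positions, n_parts, cutoff):
    """Split atoms into slabs along x; return (core, halo) index arrays per slab.

    Halo atoms pad each slab by one interaction cutoff, so every core atom
    still sees all neighbors it would have in the unpartitioned structure.
    (Hypothetical helper for illustration only.)
    """
    x = positions[:, 0]
    edges = np.linspace(x.min(), x.max() + 1e-9, n_parts + 1)
    parts = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        core = np.flatnonzero((x >= lo) & (x < hi))
        halo = np.flatnonzero(((x >= lo - cutoff) & (x < lo)) |
                              ((x >= hi) & (x < hi + cutoff)))
        parts.append((core, halo))
    return parts

def spectrum_error(H_pred, H_ref):
    """Relative error between sorted eigenvalue spectra of two Hermitian matrices."""
    e_pred = np.linalg.eigvalsh(H_pred)  # eigvalsh returns eigenvalues in ascending order
    e_ref = np.linalg.eigvalsh(H_ref)
    return np.linalg.norm(e_pred - e_ref) / np.linalg.norm(e_ref)

Under this scheme, training would run on each core-plus-halo subgraph, with the loss restricted to orbital interactions whose endpoints are both core atoms; halo atoms contribute context only. This keeps every training subgraph small while leaving no atom with a truncated neighborhood.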

Cite this Paper


BibTeX
@InProceedings{pmlr-v267-xia25b,
  title     = {Learning the Electronic {H}amiltonian of Large Atomic Structures},
  author    = {Xia, Chen Hao and Kaniselvan, Manasa and Ziogas, Alexandros Nikolaos and Mladenovi\'{c}, Marko and Mahjoub, Rayen and Maeder, Alexander and Luisier, Mathieu},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  pages     = {68236--68257},
  year      = {2025},
  editor    = {Singh, Aarti and Fazel, Maryam and Hsu, Daniel and Lacoste-Julien, Simon and Berkenkamp, Felix and Maharaj, Tegan and Wagstaff, Kiri and Zhu, Jerry},
  volume    = {267},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--19 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v267/main/assets/xia25b/xia25b.pdf},
  url       = {https://proceedings.mlr.press/v267/xia25b.html},
  abstract  = {Graph neural networks (GNNs) have shown promise in learning the ground-state electronic properties of materials, subverting ab initio density functional theory (DFT) calculations when the underlying lattices can be represented as small and/or repeatable unit cells (i.e., molecules and periodic crystals). Realistic systems are, however, non-ideal and generally characterized by higher structural complexity. As such, they require large (10+ {Å}) unit cells and thousands of atoms to be accurately described. At these scales, DFT becomes computationally prohibitive, making GNNs especially attractive. In this work, we present a strictly local equivariant GNN capable of learning the electronic Hamiltonian (H) of realistically extended materials. It incorporates an augmented partitioning approach that enables training on arbitrarily large structures while preserving local atomic environments beyond boundaries. We demonstrate its capabilities by predicting the electronic Hamiltonian of various systems with up to 3,000 nodes (atoms), 500,000+ edges, 28 million orbital interactions (nonzero entries of H), and $\leq$0.53% error in the eigenvalue spectra. Our work expands the applicability of current electronic property prediction methods to some of the most challenging cases encountered in computational materials science, namely systems with disorder, interfaces, and defects.}
}
Endnote
%0 Conference Paper
%T Learning the Electronic Hamiltonian of Large Atomic Structures
%A Chen Hao Xia
%A Manasa Kaniselvan
%A Alexandros Nikolaos Ziogas
%A Marko Mladenović
%A Rayen Mahjoub
%A Alexander Maeder
%A Mathieu Luisier
%B Proceedings of the 42nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Aarti Singh
%E Maryam Fazel
%E Daniel Hsu
%E Simon Lacoste-Julien
%E Felix Berkenkamp
%E Tegan Maharaj
%E Kiri Wagstaff
%E Jerry Zhu
%F pmlr-v267-xia25b
%I PMLR
%P 68236--68257
%U https://proceedings.mlr.press/v267/xia25b.html
%V 267
%X Graph neural networks (GNNs) have shown promise in learning the ground-state electronic properties of materials, subverting ab initio density functional theory (DFT) calculations when the underlying lattices can be represented as small and/or repeatable unit cells (i.e., molecules and periodic crystals). Realistic systems are, however, non-ideal and generally characterized by higher structural complexity. As such, they require large (10+ Å) unit cells and thousands of atoms to be accurately described. At these scales, DFT becomes computationally prohibitive, making GNNs especially attractive. In this work, we present a strictly local equivariant GNN capable of learning the electronic Hamiltonian (H) of realistically extended materials. It incorporates an augmented partitioning approach that enables training on arbitrarily large structures while preserving local atomic environments beyond boundaries. We demonstrate its capabilities by predicting the electronic Hamiltonian of various systems with up to 3,000 nodes (atoms), 500,000+ edges, 28 million orbital interactions (nonzero entries of H), and ≤0.53% error in the eigenvalue spectra. Our work expands the applicability of current electronic property prediction methods to some of the most challenging cases encountered in computational materials science, namely systems with disorder, interfaces, and defects.
APA
Xia, C.H., Kaniselvan, M., Ziogas, A.N., Mladenović, M., Mahjoub, R., Maeder, A., & Luisier, M. (2025). Learning the Electronic Hamiltonian of Large Atomic Structures. Proceedings of the 42nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 267:68236-68257. Available from https://proceedings.mlr.press/v267/xia25b.html.
