Morphological-Symmetry-Equivariant Heterogeneous Graph Neural Network for Robotic Dynamics Learning

Fengze Xie, Sizhe Wei, Yue Song, Yisong Yue, Lu Gan
Proceedings of the 7th Annual Learning for Dynamics & Control Conference, PMLR 283:1392-1405, 2025.

Abstract

We propose MS-HGNN, a Morphological-Symmetry-Equivariant Heterogeneous Graph Neural Network for robotic dynamics learning, which integrates robotic kinematic structures and morphological symmetries into a unified graph network. By embedding these structural priors as inductive biases, MS-HGNN ensures high generalizability, sample and model efficiency. This architecture is versatile and broadly applicable to various multi-body dynamic systems and dynamics learning tasks. We prove the morphological-symmetry-equivariant property of MS-HGNN and demonstrate its effectiveness across multiple quadruped robot dynamics learning problems using real-world and simulated data. Our code is available at https://github.com/lunarlab-gatech/MorphSym-HGNN/.

Cite this Paper


BibTeX
@InProceedings{pmlr-v283-xie25a,
  title     = {Morphological-Symmetry-Equivariant Heterogeneous Graph Neural Network for Robotic Dynamics Learning},
  author    = {Xie, Fengze and Wei, Sizhe and Song, Yue and Yue, Yisong and Gan, Lu},
  booktitle = {Proceedings of the 7th Annual Learning for Dynamics \& Control Conference},
  pages     = {1392--1405},
  year      = {2025},
  editor    = {Ozay, Necmiye and Balzano, Laura and Panagou, Dimitra and Abate, Alessandro},
  volume    = {283},
  series    = {Proceedings of Machine Learning Research},
  month     = {04--06 Jun},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v283/main/assets/xie25a/xie25a.pdf},
  url       = {https://proceedings.mlr.press/v283/xie25a.html},
  abstract  = {We propose MS-HGNN, a Morphological-Symmetry-Equivariant Heterogeneous Graph Neural Network for robotic dynamics learning, which integrates robotic kinematic structures and morphological symmetries into a unified graph network. By embedding these structural priors as inductive biases, MS-HGNN ensures high generalizability, sample and model efficiency. This architecture is versatile and broadly applicable to various multi-body dynamic systems and dynamics learning tasks. We prove the morphological-symmetry-equivariant property of MS-HGNN and demonstrate its effectiveness across multiple quadruped robot dynamics learning problems using real-world and simulated data. Our code is available at https://github.com/lunarlab-gatech/MorphSym-HGNN/.}
}
Endnote
%0 Conference Paper
%T Morphological-Symmetry-Equivariant Heterogeneous Graph Neural Network for Robotic Dynamics Learning
%A Fengze Xie
%A Sizhe Wei
%A Yue Song
%A Yisong Yue
%A Lu Gan
%B Proceedings of the 7th Annual Learning for Dynamics & Control Conference
%C Proceedings of Machine Learning Research
%D 2025
%E Necmiye Ozay
%E Laura Balzano
%E Dimitra Panagou
%E Alessandro Abate
%F pmlr-v283-xie25a
%I PMLR
%P 1392--1405
%U https://proceedings.mlr.press/v283/xie25a.html
%V 283
%X We propose MS-HGNN, a Morphological-Symmetry-Equivariant Heterogeneous Graph Neural Network for robotic dynamics learning, which integrates robotic kinematic structures and morphological symmetries into a unified graph network. By embedding these structural priors as inductive biases, MS-HGNN ensures high generalizability, sample and model efficiency. This architecture is versatile and broadly applicable to various multi-body dynamic systems and dynamics learning tasks. We prove the morphological-symmetry-equivariant property of MS-HGNN and demonstrate its effectiveness across multiple quadruped robot dynamics learning problems using real-world and simulated data. Our code is available at https://github.com/lunarlab-gatech/MorphSym-HGNN/.
APA
Xie, F., Wei, S., Song, Y., Yue, Y. & Gan, L. (2025). Morphological-Symmetry-Equivariant Heterogeneous Graph Neural Network for Robotic Dynamics Learning. Proceedings of the 7th Annual Learning for Dynamics & Control Conference, in Proceedings of Machine Learning Research 283:1392-1405. Available from https://proceedings.mlr.press/v283/xie25a.html.