NOD-TAMP: Generalizable Long-Horizon Planning with Neural Object Descriptors

Shuo Cheng, Caelan Reed Garrett, Ajay Mandlekar, Danfei Xu
Proceedings of The 8th Conference on Robot Learning, PMLR 270:1310-1339, 2025.

Abstract

Solving complex manipulation tasks in household and factory settings remains challenging due to long-horizon reasoning, fine-grained interactions, and broad object and scene diversity. Learning skills from demonstrations can be an effective strategy, but such methods often have limited generalizability beyond training data and struggle to solve long-horizon tasks. To overcome this, we propose to synergistically combine two paradigms: Neural Object Descriptors (NODs) that produce generalizable object-centric features and Task and Motion Planning (TAMP) frameworks that chain short-horizon skills to solve multi-step tasks. We introduce NOD-TAMP, a TAMP-based framework that extracts short manipulation trajectories from a handful of human demonstrations, adapts these trajectories using NOD features, and composes them to solve broad long-horizon, contact-rich tasks. NOD-TAMP solves existing manipulation benchmarks with a handful of demonstrations and significantly outperforms prior NOD-based approaches on new tabletop manipulation tasks that require diverse generalization. Finally, we deploy NOD-TAMP on a number of real-world tasks, including tool-use and high-precision insertion. For more details, please visit https://nodtamp.github.io/.

Cite this Paper


BibTeX
@InProceedings{pmlr-v270-cheng25a,
  title     = {NOD-TAMP: Generalizable Long-Horizon Planning with Neural Object Descriptors},
  author    = {Cheng, Shuo and Garrett, Caelan Reed and Mandlekar, Ajay and Xu, Danfei},
  booktitle = {Proceedings of The 8th Conference on Robot Learning},
  pages     = {1310--1339},
  year      = {2025},
  editor    = {Agrawal, Pulkit and Kroemer, Oliver and Burgard, Wolfram},
  volume    = {270},
  series    = {Proceedings of Machine Learning Research},
  month     = {06--09 Nov},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v270/main/assets/cheng25a/cheng25a.pdf},
  url       = {https://proceedings.mlr.press/v270/cheng25a.html},
  abstract  = {Solving complex manipulation tasks in household and factory settings remains challenging due to long-horizon reasoning, fine-grained interactions, and broad object and scene diversity. Learning skills from demonstrations can be an effective strategy, but such methods often have limited generalizability beyond training data and struggle to solve long-horizon tasks. To overcome this, we propose to synergistically combine two paradigms: Neural Object Descriptors (NODs) that produce generalizable object-centric features and Task and Motion Planning (TAMP) frameworks that chain short-horizon skills to solve multi-step tasks. We introduce NOD-TAMP, a TAMP-based framework that extracts short manipulation trajectories from a handful of human demonstrations, adapts these trajectories using NOD features, and composes them to solve broad long-horizon, contact-rich tasks. NOD-TAMP solves existing manipulation benchmarks with a handful of demonstrations and significantly outperforms prior NOD-based approaches on new tabletop manipulation tasks that require diverse generalization. Finally, we deploy NOD-TAMP on a number of real-world tasks, including tool-use and high-precision insertion. For more details, please visit https://nodtamp.github.io/.}
}
Endnote
%0 Conference Paper
%T NOD-TAMP: Generalizable Long-Horizon Planning with Neural Object Descriptors
%A Shuo Cheng
%A Caelan Reed Garrett
%A Ajay Mandlekar
%A Danfei Xu
%B Proceedings of The 8th Conference on Robot Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Pulkit Agrawal
%E Oliver Kroemer
%E Wolfram Burgard
%F pmlr-v270-cheng25a
%I PMLR
%P 1310--1339
%U https://proceedings.mlr.press/v270/cheng25a.html
%V 270
%X Solving complex manipulation tasks in household and factory settings remains challenging due to long-horizon reasoning, fine-grained interactions, and broad object and scene diversity. Learning skills from demonstrations can be an effective strategy, but such methods often have limited generalizability beyond training data and struggle to solve long-horizon tasks. To overcome this, we propose to synergistically combine two paradigms: Neural Object Descriptors (NODs) that produce generalizable object-centric features and Task and Motion Planning (TAMP) frameworks that chain short-horizon skills to solve multi-step tasks. We introduce NOD-TAMP, a TAMP-based framework that extracts short manipulation trajectories from a handful of human demonstrations, adapts these trajectories using NOD features, and composes them to solve broad long-horizon, contact-rich tasks. NOD-TAMP solves existing manipulation benchmarks with a handful of demonstrations and significantly outperforms prior NOD-based approaches on new tabletop manipulation tasks that require diverse generalization. Finally, we deploy NOD-TAMP on a number of real-world tasks, including tool-use and high-precision insertion. For more details, please visit https://nodtamp.github.io/.
APA
Cheng, S., Garrett, C. R., Mandlekar, A., & Xu, D. (2025). NOD-TAMP: Generalizable Long-Horizon Planning with Neural Object Descriptors. Proceedings of The 8th Conference on Robot Learning, in Proceedings of Machine Learning Research 270:1310-1339. Available from https://proceedings.mlr.press/v270/cheng25a.html.