Bayesian Object Models for Robotic Interaction with Differentiable Probabilistic Programming

Krishna Murthy Jatavallabhula, Miles Macklin, Dieter Fox, Animesh Garg, Fabio Ramos
Proceedings of The 6th Conference on Robot Learning, PMLR 205:1563-1574, 2023.

Abstract

A hallmark of human intelligence is the ability to build rich mental models of previously unseen objects from very few interactions. To achieve true, continuous autonomy, robots too must possess this ability. Importantly, to integrate with the probabilistic robotics software stack, such models must encapsulate the uncertainty (resulting from noisy dynamics and observation models) in a prescriptive manner. We present Bayesian Object Models (BOMs): generative (probabilistic) models that encode both the structural and kinodynamic attributes of an object. BOMs are implemented in the form of a differentiable probabilistic program that models latent scene structure, object dynamics, and observation models. This allows for efficient and automated Bayesian inference – samples (object trajectories) drawn from the BOM are compared with a small set of real-world observations and used to compute a likelihood function. Our model comprises a differentiable tree structure sampler and a differentiable physics engine, enabling gradient computation through this likelihood function. This enables gradient-based Bayesian inference to efficiently update the distributional parameters of our model. BOMs outperform several recent approaches, including differentiable physics-based, gradient-free, and neural inference schemes. Further information at: https://bayesianobjects.github.io/
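
To make the inference scheme described above concrete, the following is a minimal illustrative sketch (not the authors' implementation) of gradient-based Bayesian inference through a differentiable generative model, written with Pyro. A toy differentiable hinge rollout (the simulate function, the LogNormal prior over damping, and the 0.05 observation noise scale are all illustrative assumptions) stands in for the paper's differentiable tree-structure sampler and physics engine; the point is only to show how gradients of the likelihood flow back into the distributional parameters.

    # Minimal illustrative sketch (not the paper's code): a differentiable
    # probabilistic program over one latent kinodynamic attribute (a hinge
    # joint's damping), with a toy differentiable rollout in place of the
    # paper's differentiable physics engine and tree-structure sampler.
    import torch
    import pyro
    import pyro.distributions as dist
    from pyro.infer import SVI, Trace_ELBO
    from pyro.infer.autoguide import AutoNormal
    from pyro.optim import Adam

    def simulate(damping, steps=50, dt=0.05):
        # Toy differentiable dynamics: damped hinge angle under gravity.
        theta, omega = torch.tensor(1.0), torch.tensor(0.0)
        traj = []
        for _ in range(steps):
            omega = omega + dt * (-9.81 * torch.sin(theta) - damping * omega)
            theta = theta + dt * omega
            traj.append(theta)
        return torch.stack(traj)

    def model(observed_traj):
        # Prior over the latent kinodynamic attribute.
        damping = pyro.sample("damping", dist.LogNormal(0.0, 1.0))
        # Differentiable rollout; the likelihood below compares sampled
        # trajectories against real observations, and gradients flow back
        # into the distributional parameters during inference.
        traj = simulate(damping)
        with pyro.plate("time", observed_traj.shape[0]):
            pyro.sample("obs", dist.Normal(traj, 0.05), obs=observed_traj)

    # A small set of noisy observations (synthetic here, for illustration).
    observed = simulate(torch.tensor(0.7)) + 0.05 * torch.randn(50)

    # Gradient-based variational inference updates the guide's parameters.
    guide = AutoNormal(model)
    svi = SVI(model, guide, Adam({"lr": 0.02}), loss=Trace_ELBO())
    for step in range(500):
        svi.step(observed)

In this sketch the variational posterior over the latent parameter is updated by stochastic gradient steps on the ELBO, which is possible only because the rollout is differentiable end to end; a richer model would also sample the object's latent tree structure, as the paper describes.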

Cite this Paper


BibTeX
@InProceedings{pmlr-v205-jatavallabhula23a,
  title     = {Bayesian Object Models for Robotic Interaction with Differentiable Probabilistic Programming},
  author    = {Jatavallabhula, Krishna Murthy and Macklin, Miles and Fox, Dieter and Garg, Animesh and Ramos, Fabio},
  booktitle = {Proceedings of The 6th Conference on Robot Learning},
  pages     = {1563--1574},
  year      = {2023},
  editor    = {Liu, Karen and Kulic, Dana and Ichnowski, Jeff},
  volume    = {205},
  series    = {Proceedings of Machine Learning Research},
  month     = {14--18 Dec},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v205/jatavallabhula23a/jatavallabhula23a.pdf},
  url       = {https://proceedings.mlr.press/v205/jatavallabhula23a.html},
  abstract  = {A hallmark of human intelligence is the ability to build rich mental models of previously unseen objects from very few interactions. To achieve true, continuous autonomy, robots too must possess this ability. Importantly, to integrate with the probabilistic robotics software stack, such models must encapsulate the uncertainty (resulting from noisy dynamics and observation models) in a prescriptive manner. We present Bayesian Object Models (BOMs): generative (probabilistic) models that encode both the structural and kinodynamic attributes of an object. BOMs are implemented in the form of a differentiable probabilistic program that models latent scene structure, object dynamics, and observation models. This allows for efficient and automated Bayesian inference – samples (object trajectories) drawn from the BOM are compared with a small set of real-world observations and used to compute a likelihood function. Our model comprises a differentiable tree structure sampler and a differentiable physics engine, enabling gradient computation through this likelihood function. This enables gradient-based Bayesian inference to efficiently update the distributional parameters of our model. BOMs outperform several recent approaches, including differentiable physics-based, gradient-free, and neural inference schemes. Further information at: https://bayesianobjects.github.io/}
}
Endnote
%0 Conference Paper
%T Bayesian Object Models for Robotic Interaction with Differentiable Probabilistic Programming
%A Krishna Murthy Jatavallabhula
%A Miles Macklin
%A Dieter Fox
%A Animesh Garg
%A Fabio Ramos
%B Proceedings of The 6th Conference on Robot Learning
%C Proceedings of Machine Learning Research
%D 2023
%E Karen Liu
%E Dana Kulic
%E Jeff Ichnowski
%F pmlr-v205-jatavallabhula23a
%I PMLR
%P 1563--1574
%U https://proceedings.mlr.press/v205/jatavallabhula23a.html
%V 205
%X A hallmark of human intelligence is the ability to build rich mental models of previously unseen objects from very few interactions. To achieve true, continuous autonomy, robots too must possess this ability. Importantly, to integrate with the probabilistic robotics software stack, such models must encapsulate the uncertainty (resulting from noisy dynamics and observation models) in a prescriptive manner. We present Bayesian Object Models (BOMs): generative (probabilistic) models that encode both the structural and kinodynamic attributes of an object. BOMs are implemented in the form of a differentiable probabilistic program that models latent scene structure, object dynamics, and observation models. This allows for efficient and automated Bayesian inference – samples (object trajectories) drawn from the BOM are compared with a small set of real-world observations and used to compute a likelihood function. Our model comprises a differentiable tree structure sampler and a differentiable physics engine, enabling gradient computation through this likelihood function. This enables gradient-based Bayesian inference to efficiently update the distributional parameters of our model. BOMs outperform several recent approaches, including differentiable physics-based, gradient-free, and neural inference schemes. Further information at: https://bayesianobjects.github.io/
APA
Jatavallabhula, K.M., Macklin, M., Fox, D., Garg, A. & Ramos, F. (2023). Bayesian Object Models for Robotic Interaction with Differentiable Probabilistic Programming. Proceedings of The 6th Conference on Robot Learning, in Proceedings of Machine Learning Research 205:1563-1574. Available from https://proceedings.mlr.press/v205/jatavallabhula23a.html.
