Scalable Differentiable Physics for Learning and Control

Yi-Ling Qiao, Junbang Liang, Vladlen Koltun, Ming Lin
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:7847-7856, 2020.

Abstract

Differentiable physics is a powerful approach to learning and control problems that involve physical objects and environments. While notable progress has been made, the capabilities of differentiable physics solvers remain limited. We develop a scalable framework for differentiable physics that can support a large number of objects and their interactions. To accommodate objects with arbitrary geometry and topology, we adopt meshes as our representation and leverage the sparsity of contacts for scalable differentiable collision handling. Collisions are resolved in localized regions to minimize the number of optimization variables even when the number of simulated objects is high. We further accelerate implicit differentiation of optimization with nonlinear constraints. Experiments demonstrate that the presented framework requires up to two orders of magnitude less memory and computation in comparison to recent particle-based methods. We further validate the approach on inverse problems and control scenarios, where it outperforms derivative-free and model-free baselines by at least an order of magnitude.
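The abstract mentions accelerating implicit differentiation of optimization with nonlinear constraints. As a hedged illustration of the general idea (not the paper's actual solver), the implicit function theorem lets us differentiate through an optimizer's output without unrolling it: at an optimum `x*(theta)` the stationarity condition `grad_x f(x*, theta) = 0` holds, so `dx*/dtheta = -[d2f/dx2]^{-1} [d2f/dx dtheta]`. A minimal sketch on a toy quadratic objective (all names here are hypothetical):

```python
import numpy as np

# Toy objective f(x, theta) = 0.5 * ||x - theta||^2.
# Its minimizer is x* = theta, so the true Jacobian dx*/dtheta is the identity.

def solve(theta):
    # Stand-in for an inner optimization; for this toy objective the
    # closed-form minimizer is just theta itself.
    return theta.copy()

def implicit_grad(theta):
    # Implicit function theorem at the optimum:
    #   dx*/dtheta = -H^{-1} M,
    # where H = d2f/dx2 (here the identity) and M = d2f/dx dtheta (here -I).
    n = len(theta)
    hess = np.eye(n)
    mixed = -np.eye(n)
    return -np.linalg.solve(hess, mixed)

theta = np.array([1.0, 2.0])
x_star = solve(theta)          # optimum of the inner problem
J = implicit_grad(theta)       # Jacobian dx*/dtheta via implicit differentiation
```

The point of this construction is that the gradient computation needs only linear algebra at the solution point, rather than backpropagating through every iteration of the inner solver; this is what makes implicit differentiation attractive for scalability.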

Cite this Paper


BibTeX
@InProceedings{pmlr-v119-qiao20a,
  title     = {Scalable Differentiable Physics for Learning and Control},
  author    = {Qiao, Yi-Ling and Liang, Junbang and Koltun, Vladlen and Lin, Ming},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {7847--7856},
  year      = {2020},
  editor    = {III, Hal Daumé and Singh, Aarti},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/qiao20a/qiao20a.pdf},
  url       = {https://proceedings.mlr.press/v119/qiao20a.html},
  abstract  = {Differentiable physics is a powerful approach to learning and control problems that involve physical objects and environments. While notable progress has been made, the capabilities of differentiable physics solvers remain limited. We develop a scalable framework for differentiable physics that can support a large number of objects and their interactions. To accommodate objects with arbitrary geometry and topology, we adopt meshes as our representation and leverage the sparsity of contacts for scalable differentiable collision handling. Collisions are resolved in localized regions to minimize the number of optimization variables even when the number of simulated objects is high. We further accelerate implicit differentiation of optimization with nonlinear constraints. Experiments demonstrate that the presented framework requires up to two orders of magnitude less memory and computation in comparison to recent particle-based methods. We further validate the approach on inverse problems and control scenarios, where it outperforms derivative-free and model-free baselines by at least an order of magnitude.}
}
Endnote
%0 Conference Paper
%T Scalable Differentiable Physics for Learning and Control
%A Yi-Ling Qiao
%A Junbang Liang
%A Vladlen Koltun
%A Ming Lin
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-qiao20a
%I PMLR
%P 7847--7856
%U https://proceedings.mlr.press/v119/qiao20a.html
%V 119
%X Differentiable physics is a powerful approach to learning and control problems that involve physical objects and environments. While notable progress has been made, the capabilities of differentiable physics solvers remain limited. We develop a scalable framework for differentiable physics that can support a large number of objects and their interactions. To accommodate objects with arbitrary geometry and topology, we adopt meshes as our representation and leverage the sparsity of contacts for scalable differentiable collision handling. Collisions are resolved in localized regions to minimize the number of optimization variables even when the number of simulated objects is high. We further accelerate implicit differentiation of optimization with nonlinear constraints. Experiments demonstrate that the presented framework requires up to two orders of magnitude less memory and computation in comparison to recent particle-based methods. We further validate the approach on inverse problems and control scenarios, where it outperforms derivative-free and model-free baselines by at least an order of magnitude.
APA
Qiao, Y., Liang, J., Koltun, V., & Lin, M. (2020). Scalable Differentiable Physics for Learning and Control. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:7847-7856. Available from https://proceedings.mlr.press/v119/qiao20a.html.