FabricFlowNet: Bimanual Cloth Manipulation with a Flow-based Policy

Thomas Weng, Sujay Man Bajracharya, Yufei Wang, Khush Agrawal, David Held
Proceedings of the 5th Conference on Robot Learning, PMLR 164:192-202, 2022.

Abstract

We address the problem of goal-directed cloth manipulation, a challenging task due to the deformability of cloth. Our insight is that optical flow, a technique normally used for motion estimation in video, can also provide an effective representation for corresponding cloth poses across observation and goal images. We introduce FabricFlowNet (FFN), a cloth manipulation policy that leverages flow as both an input and as an action representation to improve performance. FabricFlowNet also elegantly switches between bimanual and single-arm actions based on the desired goal. We show that FabricFlowNet significantly outperforms state-of-the-art model-free and model-based cloth manipulation policies that take image input. We also present real-world experiments on a bimanual system, demonstrating effective sim-to-real transfer. Finally, we show that our method generalizes when trained on a single square cloth to other cloth shapes, such as T-shirts and rectangular cloths. Video and other supplementary materials are available at: https://sites.google.com/view/fabricflownet.
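To make the core idea concrete: the flow field between the current and goal cloth images gives, for each candidate pick pixel, a corresponding place pixel (the pick displaced by the flow at that point). The sketch below is a hypothetical illustration of that mechanism under our own assumptions; the function names, the (row, col, 2) flow convention, and the pixel-distance threshold for falling back to a single arm are illustrative, not the authors' implementation.

    import numpy as np

    def place_from_flow(flow, pick):
        """Map a pick pixel (row, col) to a place pixel using dense flow.

        flow: (H, W, 2) array; flow[r, c] is the (dr, dc) displacement
              moving the cloth point imaged at (r, c) to its goal location.
        pick: (row, col) pixel chosen by some pick-point predictor.
        """
        r, c = pick
        dr, dc = flow[r, c]
        return int(round(r + dr)), int(round(c + dc))

    def bimanual_action(flow, pick_a, pick_b, min_pick_dist=10):
        """Return one or two (pick, place) pairs.

        If the two predicted picks nearly coincide, fall back to a
        single-arm action, mirroring FFN's switch between bimanual and
        single-arm manipulation. The pixel threshold is an assumption.
        """
        place_a = place_from_flow(flow, pick_a)
        if np.hypot(pick_a[0] - pick_b[0], pick_a[1] - pick_b[1]) < min_pick_dist:
            return [(pick_a, place_a)]  # single-arm action
        place_b = place_from_flow(flow, pick_b)
        return [(pick_a, place_a), (pick_b, place_b)]  # bimanual action

Used this way, flow serves both as the policy's input (a correspondence between observation and goal) and as its action representation (place points read directly off the flow at the picks).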

Cite this Paper


BibTeX
@InProceedings{pmlr-v164-weng22a,
  title     = {FabricFlowNet: Bimanual Cloth Manipulation with a Flow-based Policy},
  author    = {Weng, Thomas and Bajracharya, Sujay Man and Wang, Yufei and Agrawal, Khush and Held, David},
  booktitle = {Proceedings of the 5th Conference on Robot Learning},
  pages     = {192--202},
  year      = {2022},
  editor    = {Faust, Aleksandra and Hsu, David and Neumann, Gerhard},
  volume    = {164},
  series    = {Proceedings of Machine Learning Research},
  month     = {08--11 Nov},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v164/weng22a/weng22a.pdf},
  url       = {https://proceedings.mlr.press/v164/weng22a.html},
  abstract  = {We address the problem of goal-directed cloth manipulation, a challenging task due to the deformability of cloth. Our insight is that optical flow, a technique normally used for motion estimation in video, can also provide an effective representation for corresponding cloth poses across observation and goal images. We introduce FabricFlowNet (FFN), a cloth manipulation policy that leverages flow as both an input and as an action representation to improve performance. FabricFlowNet also elegantly switches between bimanual and single-arm actions based on the desired goal. We show that FabricFlowNet significantly outperforms state-of-the-art model-free and model-based cloth manipulation policies that take image input. We also present real-world experiments on a bimanual system, demonstrating effective sim-to-real transfer. Finally, we show that our method generalizes when trained on a single square cloth to other cloth shapes, such as T-shirts and rectangular cloths. Video and other supplementary materials are available at: https://sites.google.com/view/fabricflownet.}
}
Endnote
%0 Conference Paper
%T FabricFlowNet: Bimanual Cloth Manipulation with a Flow-based Policy
%A Thomas Weng
%A Sujay Man Bajracharya
%A Yufei Wang
%A Khush Agrawal
%A David Held
%B Proceedings of the 5th Conference on Robot Learning
%C Proceedings of Machine Learning Research
%D 2022
%E Aleksandra Faust
%E David Hsu
%E Gerhard Neumann
%F pmlr-v164-weng22a
%I PMLR
%P 192--202
%U https://proceedings.mlr.press/v164/weng22a.html
%V 164
%X We address the problem of goal-directed cloth manipulation, a challenging task due to the deformability of cloth. Our insight is that optical flow, a technique normally used for motion estimation in video, can also provide an effective representation for corresponding cloth poses across observation and goal images. We introduce FabricFlowNet (FFN), a cloth manipulation policy that leverages flow as both an input and as an action representation to improve performance. FabricFlowNet also elegantly switches between bimanual and single-arm actions based on the desired goal. We show that FabricFlowNet significantly outperforms state-of-the-art model-free and model-based cloth manipulation policies that take image input. We also present real-world experiments on a bimanual system, demonstrating effective sim-to-real transfer. Finally, we show that our method generalizes when trained on a single square cloth to other cloth shapes, such as T-shirts and rectangular cloths. Video and other supplementary materials are available at: https://sites.google.com/view/fabricflownet.
APA
Weng, T., Bajracharya, S.M., Wang, Y., Agrawal, K. & Held, D. (2022). FabricFlowNet: Bimanual Cloth Manipulation with a Flow-based Policy. Proceedings of the 5th Conference on Robot Learning, in Proceedings of Machine Learning Research 164:192-202. Available from https://proceedings.mlr.press/v164/weng22a.html.
