Visuotactile Affordances for Cloth Manipulation with Local Control

Neha Sunil, Shaoxiong Wang, Yu She, Edward Adelson, Alberto Rodriguez Garcia
Proceedings of The 6th Conference on Robot Learning, PMLR 205:1596-1606, 2023.

Abstract

Cloth in the real world is often crumpled, self-occluded, or folded in on itself such that key regions, such as corners, are not directly graspable, making manipulation difficult. We propose a system that leverages visual and tactile perception to unfold the cloth via grasping and sliding on edges. Doing so, the robot is able to grasp two adjacent corners, enabling subsequent manipulation tasks like folding or hanging. We develop tactile perception networks that classify whether an edge is grasped and estimate the pose of the edge. We use the edge classification network to supervise a visuotactile edge grasp affordance network that can grasp edges with a 90% success rate. Once an edge is grasped, we demonstrate that the robot can slide along the cloth to the adjacent corner using tactile pose estimation/control in real time.

Cite this Paper


BibTeX
@InProceedings{pmlr-v205-sunil23a,
  title     = {Visuotactile Affordances for Cloth Manipulation with Local Control},
  author    = {Sunil, Neha and Wang, Shaoxiong and She, Yu and Adelson, Edward and Garcia, Alberto Rodriguez},
  booktitle = {Proceedings of The 6th Conference on Robot Learning},
  pages     = {1596--1606},
  year      = {2023},
  editor    = {Liu, Karen and Kulic, Dana and Ichnowski, Jeff},
  volume    = {205},
  series    = {Proceedings of Machine Learning Research},
  month     = {14--18 Dec},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v205/sunil23a/sunil23a.pdf},
  url       = {https://proceedings.mlr.press/v205/sunil23a.html},
  abstract  = {Cloth in the real world is often crumpled, self-occluded, or folded in on itself such that key regions, such as corners, are not directly graspable, making manipulation difficult. We propose a system that leverages visual and tactile perception to unfold the cloth via grasping and sliding on edges. Doing so, the robot is able to grasp two adjacent corners, enabling subsequent manipulation tasks like folding or hanging. We develop tactile perception networks that classify whether an edge is grasped and estimate the pose of the edge. We use the edge classification network to supervise a visuotactile edge grasp affordance network that can grasp edges with a 90% success rate. Once an edge is grasped, we demonstrate that the robot can slide along the cloth to the adjacent corner using tactile pose estimation/control in real time.}
}
Endnote
%0 Conference Paper
%T Visuotactile Affordances for Cloth Manipulation with Local Control
%A Neha Sunil
%A Shaoxiong Wang
%A Yu She
%A Edward Adelson
%A Alberto Rodriguez Garcia
%B Proceedings of The 6th Conference on Robot Learning
%C Proceedings of Machine Learning Research
%D 2023
%E Karen Liu
%E Dana Kulic
%E Jeff Ichnowski
%F pmlr-v205-sunil23a
%I PMLR
%P 1596--1606
%U https://proceedings.mlr.press/v205/sunil23a.html
%V 205
%X Cloth in the real world is often crumpled, self-occluded, or folded in on itself such that key regions, such as corners, are not directly graspable, making manipulation difficult. We propose a system that leverages visual and tactile perception to unfold the cloth via grasping and sliding on edges. Doing so, the robot is able to grasp two adjacent corners, enabling subsequent manipulation tasks like folding or hanging. We develop tactile perception networks that classify whether an edge is grasped and estimate the pose of the edge. We use the edge classification network to supervise a visuotactile edge grasp affordance network that can grasp edges with a 90% success rate. Once an edge is grasped, we demonstrate that the robot can slide along the cloth to the adjacent corner using tactile pose estimation/control in real time.
APA
Sunil, N., Wang, S., She, Y., Adelson, E., &amp; Garcia, A. R. (2023). Visuotactile Affordances for Cloth Manipulation with Local Control. Proceedings of The 6th Conference on Robot Learning, in Proceedings of Machine Learning Research 205:1596-1606. Available from https://proceedings.mlr.press/v205/sunil23a.html.