In-Hand Gravitational Pivoting Using Tactile Sensing

Jason Toskov, Rhys Newbury, Mustafa Mukadam, Dana Kulic, Akansel Cosgun
Proceedings of The 6th Conference on Robot Learning, PMLR 205:2284-2293, 2023.

Abstract

We study gravitational pivoting, a constrained version of in-hand manipulation, where we aim to control the rotation of an object around the grip point of a parallel gripper. To achieve this, instead of controlling the gripper to avoid slip, we embrace slip to allow the object to rotate in-hand. We collect two real-world datasets, a static tracking dataset and a controller-in-the-loop dataset, both annotated with object angle and angular velocity labels. Both datasets contain force-based tactile information on ten different household objects. We train an LSTM model to predict the angular position and velocity of the held object from purely tactile data. We integrate this model with a controller that opens and closes the gripper, allowing the object to rotate to desired relative angles. We conduct real-world experiments where the robot is tasked to achieve a relative target angle. We show that our approach outperforms a sliding-window based MLP in a zero-shot generalization setting with unseen objects. Furthermore, we show a 16.6% improvement in performance when the LSTM model is fine-tuned on a small set of data collected with both the LSTM model and the controller in the loop. Code and videos are available at https://rhys-newbury.github.io/projects/pivoting/.
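As a rough illustration of the pipeline the abstract describes, the sketch below pairs a small PyTorch LSTM that regresses object angle and angular velocity from a tactile force sequence with a simple open/close gripper loop. This is a minimal sketch under assumed details, not the authors' implementation (see the project page for the official code): the tactile dimension, the gripper interface, and the stopping tolerance are all hypothetical.

# Hypothetical sketch of the approach: an LSTM predicts the held object's
# angle and angular velocity from tactile force readings, and a bang-bang
# controller loosens the grip so the object pivots under gravity, then
# re-grips once the predicted relative angle reaches the target.
# Dimensions and the gripper API are assumptions, not the authors' code.
import torch
import torch.nn as nn

class TactilePivotLSTM(nn.Module):
    def __init__(self, tactile_dim=24, hidden_dim=128, num_layers=2):
        super().__init__()
        self.lstm = nn.LSTM(tactile_dim, hidden_dim, num_layers,
                            batch_first=True)
        self.head = nn.Linear(hidden_dim, 2)  # [angle, angular velocity]

    def forward(self, tactile_seq, state=None):
        # tactile_seq: (batch, time, tactile_dim) force-based readings
        out, state = self.lstm(tactile_seq, state)
        return self.head(out[:, -1]), state   # predict from last timestep

def pivot_to_relative_angle(model, gripper, target_angle, tol=0.05):
    # Loosen the grip to let the object rotate in-hand, track the predicted
    # relative angle, and tighten once it is within tolerance of the target.
    model.eval()
    rotated, state = 0.0, None
    while abs(target_angle - rotated) > tol:
        gripper.loosen()                      # hypothetical gripper interface
        reading = gripper.read_tactile()      # (1, 1, tactile_dim) tensor
        with torch.no_grad():
            pred, state = model(reading, state)
        rotated = pred[0, 0].item()           # predicted relative angle (rad)
    gripper.tighten()                         # re-grip firmly to stop rotation

# Standalone forward pass on random data (no robot required):
model = TactilePivotLSTM()
angle_and_velocity, _ = model(torch.randn(1, 50, 24))

Carrying the LSTM hidden state across control steps, rather than re-encoding a sliding window each time, is one plausible reason a recurrent model would outperform the windowed MLP baseline mentioned in the abstract.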

Cite this Paper


BibTeX
@InProceedings{pmlr-v205-toskov23a,
  title     = {In-Hand Gravitational Pivoting Using Tactile Sensing},
  author    = {Toskov, Jason and Newbury, Rhys and Mukadam, Mustafa and Kulic, Dana and Cosgun, Akansel},
  booktitle = {Proceedings of The 6th Conference on Robot Learning},
  pages     = {2284--2293},
  year      = {2023},
  editor    = {Liu, Karen and Kulic, Dana and Ichnowski, Jeff},
  volume    = {205},
  series    = {Proceedings of Machine Learning Research},
  month     = {14--18 Dec},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v205/toskov23a/toskov23a.pdf},
  url       = {https://proceedings.mlr.press/v205/toskov23a.html}
}
Endnote
%0 Conference Paper
%T In-Hand Gravitational Pivoting Using Tactile Sensing
%A Jason Toskov
%A Rhys Newbury
%A Mustafa Mukadam
%A Dana Kulic
%A Akansel Cosgun
%B Proceedings of The 6th Conference on Robot Learning
%C Proceedings of Machine Learning Research
%D 2023
%E Karen Liu
%E Dana Kulic
%E Jeff Ichnowski
%F pmlr-v205-toskov23a
%I PMLR
%P 2284--2293
%U https://proceedings.mlr.press/v205/toskov23a.html
%V 205
APA
Toskov, J., Newbury, R., Mukadam, M., Kulic, D. & Cosgun, A. (2023). In-Hand Gravitational Pivoting Using Tactile Sensing. Proceedings of The 6th Conference on Robot Learning, in Proceedings of Machine Learning Research 205:2284-2293. Available from https://proceedings.mlr.press/v205/toskov23a.html.