Leveraging 3D Reconstruction for Mechanical Search on Cluttered Shelves

Seungyeon Kim, Young Hun Kim, Yonghyeon Lee, Frank C. Park
Proceedings of The 7th Conference on Robot Learning, PMLR 229:822-848, 2023.

Abstract

Finding and grasping a target object on a cluttered shelf, especially when the target is occluded by other unknown objects and initially invisible, remains a significant challenge in robotic manipulation. While there have been advances in finding the target object by rearranging surrounding objects using specialized tools, developing algorithms that work with standard robot grippers remains an unresolved issue. In this paper, we introduce a novel framework for finding and grasping the target object using a standard gripper, employing pushing and pick-and-place actions. To achieve this, we introduce two indicator functions: (i) an existence function, determining the potential presence of the target, and (ii) a graspability function, assessing the feasibility of grasping the identified target. We then formulate a model-based optimal control problem. The core component of our approach involves leveraging a 3D recognition model, enabling efficient estimation of the proposed indicator functions and their associated dynamics models. Our method succeeds in finding and grasping the target object using a standard robot gripper in both simulations and real-world settings. In particular, we demonstrate the adaptability and robustness of our method in the presence of noise in real-world vision sensor data. The code for our framework is available at https://github.com/seungyeon-k/Search-for-Grasp-public.
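
To make the formulation above concrete, the sketch below shows, in simplified form, how an existence function, a graspability function, and a dynamics model could drive a greedy one-step planner over pushing and pick-and-place actions. This is a hypothetical illustration, not the authors' implementation: the column-based shelf representation and every name below (observe, existence, graspability, simulate, plan) are stand-ins chosen for brevity, whereas the paper estimates these quantities from a learned 3D recognition model (see the linked repository for the actual code).

# Hypothetical sketch only -- all names below are illustrative, not the paper's API.
import copy
from typing import Dict, List, Optional, Tuple

Shelf = Dict[int, List[str]]   # column index -> objects from shelf front (index 0) to back
Action = Tuple[str, int]       # ("pick_and_place" | "push", column index)


def observe(shelf: Shelf) -> Dict[int, Optional[str]]:
    """A front-facing camera sees only the frontmost object of each column."""
    return {c: (stack[0] if stack else None) for c, stack in shelf.items()}


def existence(obs: Dict[int, Optional[str]]) -> Dict[int, float]:
    """Toy existence function: per-column probability that the target is (hidden) there.
    The paper estimates this from a reconstructed 3D scene instead."""
    if "target" in obs.values():
        return {c: float(o == "target") for c, o in obs.items()}
    blocked = [c for c, o in obs.items() if o is not None]  # columns that could hide it
    if not blocked:
        return {c: 0.0 for c in obs}
    return {c: (1.0 / len(blocked) if c in blocked else 0.0) for c in obs}


def graspability(obs: Dict[int, Optional[str]]) -> float:
    """Toy graspability: 1 if the target is frontmost somewhere, i.e. reachable by a
    standard gripper without displacing other objects, else 0."""
    return 1.0 if "target" in obs.values() else 0.0


def simulate(shelf: Shelf, action: Action) -> Shelf:
    """Toy dynamics model for the two action primitives."""
    kind, col = action
    nxt = copy.deepcopy(shelf)
    if kind == "pick_and_place" and nxt[col]:
        nxt[col].pop(0)                          # relocate the frontmost object off-shelf
    elif kind == "push" and nxt[col]:
        empty = [c for c in nxt if not nxt[c]]
        if empty:                                # shove the occluder into an empty column
            nxt[empty[0]].insert(0, nxt[col].pop(0))
    return nxt


def plan(shelf: Shelf) -> Action:
    """Greedy one-step lookahead: maximize graspability after the action, breaking ties
    by how much the action concentrates the existence estimate. (For brevity this
    simulates on the full shelf state; the paper plans on the reconstructed scene.)"""
    def score(a: Action) -> Tuple[float, float]:
        obs = observe(simulate(shelf, a))
        return (graspability(obs), max(existence(obs).values(), default=0.0))

    candidates = [(k, c) for k in ("pick_and_place", "push") for c in shelf if shelf[c]]
    return max(candidates, key=score)


if __name__ == "__main__":
    # Target hidden behind two occluders in column 1 of a three-column shelf.
    shelf: Shelf = {0: ["mug"], 1: ["cereal_box", "can", "target"], 2: []}
    while graspability(observe(shelf)) < 1.0:
        action = plan(shelf)
        print("execute", action)
        shelf = simulate(shelf, action)
    print("target is frontmost -> grasp it")

Running the sketch clears the occluders column by column until the target becomes frontmost and graspable, which mirrors the abstract's loop of rearranging surrounding objects to raise the graspability of the (initially invisible) target.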

Cite this Paper


BibTeX
@InProceedings{pmlr-v229-kim23a,
  title = {Leveraging 3D Reconstruction for Mechanical Search on Cluttered Shelves},
  author = {Kim, Seungyeon and Kim, Young Hun and Lee, Yonghyeon and Park, Frank C.},
  booktitle = {Proceedings of The 7th Conference on Robot Learning},
  pages = {822--848},
  year = {2023},
  editor = {Tan, Jie and Toussaint, Marc and Darvish, Kourosh},
  volume = {229},
  series = {Proceedings of Machine Learning Research},
  month = {06--09 Nov},
  publisher = {PMLR},
  pdf = {https://proceedings.mlr.press/v229/kim23a/kim23a.pdf},
  url = {https://proceedings.mlr.press/v229/kim23a.html},
  abstract = {Finding and grasping a target object on a cluttered shelf, especially when the target is occluded by other unknown objects and initially invisible, remains a significant challenge in robotic manipulation. While there have been advances in finding the target object by rearranging surrounding objects using specialized tools, developing algorithms that work with standard robot grippers remains an unresolved issue. In this paper, we introduce a novel framework for finding and grasping the target object using a standard gripper, employing pushing and pick-and-place actions. To achieve this, we introduce two indicator functions: (i) an existence function, determining the potential presence of the target, and (ii) a graspability function, assessing the feasibility of grasping the identified target. We then formulate a model-based optimal control problem. The core component of our approach involves leveraging a 3D recognition model, enabling efficient estimation of the proposed indicator functions and their associated dynamics models. Our method succeeds in finding and grasping the target object using a standard robot gripper in both simulations and real-world settings. In particular, we demonstrate the adaptability and robustness of our method in the presence of noise in real-world vision sensor data. The code for our framework is available at https://github.com/seungyeon-k/Search-for-Grasp-public.}
}
Endnote
%0 Conference Paper
%T Leveraging 3D Reconstruction for Mechanical Search on Cluttered Shelves
%A Seungyeon Kim
%A Young Hun Kim
%A Yonghyeon Lee
%A Frank C. Park
%B Proceedings of The 7th Conference on Robot Learning
%C Proceedings of Machine Learning Research
%D 2023
%E Jie Tan
%E Marc Toussaint
%E Kourosh Darvish
%F pmlr-v229-kim23a
%I PMLR
%P 822--848
%U https://proceedings.mlr.press/v229/kim23a.html
%V 229
%X Finding and grasping a target object on a cluttered shelf, especially when the target is occluded by other unknown objects and initially invisible, remains a significant challenge in robotic manipulation. While there have been advances in finding the target object by rearranging surrounding objects using specialized tools, developing algorithms that work with standard robot grippers remains an unresolved issue. In this paper, we introduce a novel framework for finding and grasping the target object using a standard gripper, employing pushing and pick-and-place actions. To achieve this, we introduce two indicator functions: (i) an existence function, determining the potential presence of the target, and (ii) a graspability function, assessing the feasibility of grasping the identified target. We then formulate a model-based optimal control problem. The core component of our approach involves leveraging a 3D recognition model, enabling efficient estimation of the proposed indicator functions and their associated dynamics models. Our method succeeds in finding and grasping the target object using a standard robot gripper in both simulations and real-world settings. In particular, we demonstrate the adaptability and robustness of our method in the presence of noise in real-world vision sensor data. The code for our framework is available at https://github.com/seungyeon-k/Search-for-Grasp-public.
APA
Kim, S., Kim, Y.H., Lee, Y. & Park, F.C. (2023). Leveraging 3D Reconstruction for Mechanical Search on Cluttered Shelves. Proceedings of The 7th Conference on Robot Learning, in Proceedings of Machine Learning Research 229:822-848. Available from https://proceedings.mlr.press/v229/kim23a.html.

Related Material