Robot grasping in dense clutter via view-based experience transfer

Abstract

To perform object grasping in dense clutter, we propose a novel grasp-detection algorithm. The algorithm obtains grasp candidates through instance segmentation and view-based experience transfer, and then applies collision avoidance and stability analysis to determine the optimal grasp for the robot. The strategy behind view-based experience transfer is to first find the object view and then transfer the grasp experience onto the clutter scenario. This strategy has two advantages over existing learning-based methods for finding grasp candidates: (1) it effectively excludes the influence of image noise and occlusion and precisely detects grasps that are well aligned with each target object, and (2) it efficiently finds the optimal grasp on each target object and offers the flexibility to adjust or redefine the grasp experience based on the type of target object. We evaluated our approach on open-source datasets and in a real-world experiment with a six-axis robot arm, a two-jaw parallel gripper, and a Kinect V2 RGB-D camera. The experimental results show that the proposed approach generalizes to objects with complex shapes and can grasp in dense clutter scenarios where different types of objects are mixed in a bin. To demonstrate our grasping pipeline, a video is provided at https://youtu.be/gQ3SO6vtTpA.
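The pipeline summarized above — transfer grasp experience from a canonical object view into the scene, prune colliding candidates, then pick the most stable grasp — can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: all function names, the 2-D pose representation, and the box-based collision check are assumptions made for clarity.

```python
# Hypothetical sketch of the view-based experience-transfer pipeline.
# All names and data structures here are illustrative assumptions.
from dataclasses import dataclass


@dataclass
class Grasp:
    x: float      # grasp centre in scene (image) coordinates
    y: float
    angle: float  # gripper rotation in radians
    score: float  # stability estimate in [0, 1]


def transfer_experience(view_grasps, pose):
    """Map grasps defined on a canonical object view into the clutter
    scene using the detected object's 2-D pose (dx, dy, rotation)."""
    dx, dy, rot = pose
    return [Grasp(g.x + dx, g.y + dy, g.angle + rot, g.score)
            for g in view_grasps]


def collision_free(grasp, obstacle_boxes, clearance=5.0):
    """Rough collision check: reject a grasp whose centre lies within
    `clearance` of any axis-aligned obstacle box (x0, y0, x1, y1)."""
    for (x0, y0, x1, y1) in obstacle_boxes:
        if (x0 - clearance <= grasp.x <= x1 + clearance and
                y0 - clearance <= grasp.y <= y1 + clearance):
            return False
    return True


def select_best_grasp(view_grasps, pose, obstacle_boxes):
    """Transfer grasp experience, prune colliding candidates, and pick
    the remaining candidate with the highest stability score."""
    candidates = transfer_experience(view_grasps, pose)
    feasible = [g for g in candidates
                if collision_free(g, obstacle_boxes)]
    return max(feasible, key=lambda g: g.score, default=None)
```

In this sketch, per-object grasp experience is just a list of `Grasp` records attached to a view, which also illustrates the flexibility noted in the abstract: redefining the experience for a new object type only means editing that list.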

Publication
In 2021 International Journal of Intelligent Robotics and Applications
Jen-Wei Wang
Ph.D. Candidate