Grasp Representation and Detection with Consistent Path in Robotic Grasping

Lu Chen, Zhuomao Li, Jing Yang, Zhenyu Lu, Peng Wu, Tianhua Chen

Research output: Contribution to journal › Article › peer-review

1 Citation (Scopus)

Abstract

Detecting feasible graspable positions on an object is crucial for robotic grasping. Existing methods generally evaluate grasp detection by comparing predicted grasps with limited ground-truth data. However, since the labeled ground-truth grasps are not exhaustive, this strategy fails to capture the full range of grasping features and may miss some feasible grasps. To address this problem, this work extends the grasp representation from isolated rectangles to consistent paths on objects, represented by single or multiple line segments. A novel grasp detection model is also proposed to predict feasible graspable regions, offering a wider selection of candidates; a multi-dimensional attention mechanism is integrated to highlight grasping-specific features. This enables an automatic search for optimal grasp rectangles among numerous grasp regions, according to the physical size of the gripper and task-specific requirements. A Grasp Path Dataset, which uses grasp paths to reveal the spatial distribution of viable grasps, is constructed for the first time. Experimental results on benchmark datasets as well as real-world scenarios demonstrate that the proposed grasp path representation improves detection accuracy on public datasets and success rates in practical robotic grasping tasks, while providing a richer set of grasp candidates.
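The core idea, representing graspable regions as continuous paths of line segments rather than isolated rectangles, can be illustrated with a minimal sketch. The function below is hypothetical (none of these names or parameters come from the paper): it samples candidate grasp rectangles along a path's segments and filters them by the gripper's physical opening width, mirroring the abstract's notion of selecting optimal rectangles from a larger pool of candidates.

```python
import math

def sample_grasps(path_segments, grasp_width, step=0.01, max_gripper_width=0.08):
    """Sample grasp rectangles (x, y, angle, width) along a grasp path.

    path_segments: list of ((x1, y1), (x2, y2)) line segments (metres).
    grasp_width:   required jaw opening for this object region (metres).
    Illustrative only; not the paper's actual algorithm.
    """
    if grasp_width > max_gripper_width:
        return []  # region not graspable with this gripper
    grasps = []
    for (x1, y1), (x2, y2) in path_segments:
        length = math.hypot(x2 - x1, y2 - y1)
        # Grasp approach is taken perpendicular to the path direction.
        angle = math.atan2(y2 - y1, x2 - x1) + math.pi / 2
        n = max(1, round(length / step))
        for i in range(n + 1):
            t = i / n
            grasps.append((x1 + t * (x2 - x1),
                           y1 + t * (y2 - y1),
                           angle,
                           grasp_width))
    return grasps

# A 10 cm horizontal path sampled at 1 cm steps yields 11 candidate
# rectangles, versus a single labeled rectangle in conventional datasets.
candidates = sample_grasps([((0.0, 0.0), (0.10, 0.0))], grasp_width=0.05)
```

The path representation thus turns one annotation into a dense family of candidates, which is what allows the detector to be scored against regions rather than a handful of ground-truth rectangles.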
Original language: English
Article number: 1119298
Number of pages: 14
Journal: IEEE Transactions on Cognitive and Developmental Systems
Early online date: 7 Aug 2025
Publication status: E-pub ahead of print - 7 Aug 2025
