DexRepNet: Learning Dexterous Robotic Grasping Network with Geometric and Spatial Hand-Object Representation



Qingtao Liu1*
Yu Cui1*
Qi Ye1✝
Zhengnan Sun1
Haoming Li1
Gaofeng Li1
Lin Shao2
Jiming Chen1

1Zhejiang University
2National University of Singapore

[Paper]
[Video]
[Code]


Abstract

Robotic dexterous grasping is a challenging problem due to the high degree of freedom (DoF) and complex contacts of multi-fingered robotic hands. Existing deep reinforcement learning (DRL) based methods leverage human demonstrations to reduce the sample complexity caused by the high-dimensional action space of dexterous grasping. However, less attention has been paid to hand-object interaction representations for high-level generalization. In this paper, we propose a novel geometric and spatial hand-object interaction representation, named DexRep, to capture dynamic object shape features and the spatial relations between hands and objects during grasping. DexRep comprises an Occupancy Feature for rough shapes within the sensing range of the moving hand, a Surface Feature for changing hand-object surface distances, and a Local-Geo Feature for the local geometric surface features most related to potential contacts. Based on the new representation, we propose a dexterous deep reinforcement learning method to learn a generalizable grasping policy, DexRepNet. Experimental results show that our method dramatically outperforms baselines using existing representations for robotic grasping, in both grasp success rate and convergence speed. It achieves a 93% grasping success rate on seen objects and over 80% grasping success rates on diverse objects of unseen categories in both simulation and real-world experiments.
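To make the three-part representation concrete, the sketch below shows one way the Occupancy, Surface, and Local-Geo features described above could be assembled into a single policy observation. All function names, grid sizes, keypoint counts, and sampling choices here are illustrative assumptions, not the paper's actual implementation; it is only meant to convey the structure of the representation.

```python
import numpy as np

# Hypothetical sketch of a DexRep-style observation. Dimensions and
# helper names are assumptions for illustration, not the released code.

def occupancy_feature(object_points, hand_center, radius=0.15, grid=8):
    """Rough object shape: voxel occupancy in a cube centered on the hand."""
    rel = (object_points - hand_center) / (2 * radius)  # normalize to [-0.5, 0.5]
    inside = np.all(np.abs(rel) < 0.5, axis=1)
    idx = ((rel[inside] + 0.5) * grid).astype(int).clip(0, grid - 1)
    occ = np.zeros((grid, grid, grid), dtype=np.float32)
    occ[idx[:, 0], idx[:, 1], idx[:, 2]] = 1.0
    return occ.ravel()

def surface_feature(object_points, hand_keypoints):
    """Distance from each hand keypoint to the nearest object surface point."""
    d = np.linalg.norm(hand_keypoints[:, None, :] - object_points[None, :, :],
                       axis=-1)
    return d.min(axis=1)

def local_geo_feature(object_points, object_normals, hand_keypoints):
    """Local surface geometry (here: the normal) at the point nearest
    each hand keypoint, a proxy for geometry around potential contacts."""
    d = np.linalg.norm(hand_keypoints[:, None, :] - object_points[None, :, :],
                       axis=-1)
    nearest = d.argmin(axis=1)
    return object_normals[nearest].ravel()

def dexrep_observation(object_points, object_normals, hand_keypoints,
                       hand_center):
    """Concatenate the three features into one observation vector for a
    DRL grasping policy."""
    return np.concatenate([
        occupancy_feature(object_points, hand_center),
        surface_feature(object_points, hand_keypoints),
        local_geo_feature(object_points, object_normals, hand_keypoints),
    ])
```

Because all three features are recomputed every step as the hand moves, the observation tracks both the changing spatial relation between hand and object and the object geometry near likely contacts.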


Method


Video


BibTeX

@inproceedings{liu2023dexrepnet,
title={Dexrepnet: Learning dexterous robotic grasping network with geometric and spatial hand-object representations},
author={Liu, Qingtao and Cui, Yu and Ye, Qi and Sun, Zhengnan and Li, Haoming and Li, Gaofeng and Shao, Lin and Chen, Jiming},
booktitle={2023 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)},
pages={3153--3160},
year={2023},
organization={IEEE}
}

Contact: Qingtao Liu, Qi Ye