# GraNet
[IROS 2023] GraNet: A Multi-Level Graph Network for 6-DoF Grasp Pose Generation in Cluttered Scenes
Official implementation of the paper "GraNet: A Multi-Level Graph Network for 6-DoF Grasp Pose Generation in Cluttered Scenes" (IROS 2023).

## Abstract
6-DoF object-agnostic grasping in unstructured environments is a critical yet challenging task in robotics. Most current works use non-optimized approaches to sample grasp locations and learn spatial features without regard to the grasping task. This paper proposes GraNet, a graph-based grasp pose generation framework that translates a point cloud scene into multi-level graphs and propagates features through graph neural networks. By building graphs at the scene level, object level, and grasp point level, GraNet enhances feature embedding at multiple scales while progressively converging to the ideal grasping locations through learning. Our pipeline can thus characterize the spatial distribution of grasps in cluttered scenes, leading to a higher rate of effective grasping. Furthermore, we enhance the representation ability of scalable graph networks with a structure-aware attention mechanism that exploits local relations in graphs. Our method achieves state-of-the-art performance on the large-scale GraspNet-1Billion benchmark, especially in grasping unseen objects (+11.62 AP). The real robot experiment shows a high success rate in grasping scattered objects, verifying the effectiveness of the proposed approach in unstructured environments.
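GraNet's core operation is turning a point cloud into graphs that a GNN can propagate features over (the actual pipeline builds DGL graphs; see Requirements). As a minimal, library-free illustration of the idea only (not the paper's code), here is how a k-nearest-neighbor graph over a point cloud can be constructed:

```python
import numpy as np

def build_knn_graph(points, k=8):
    """Build a k-NN graph over a point cloud.

    Returns (src, dst) edge-index arrays: for each point i, one edge
    from each of its k nearest neighbors j to i.
    """
    n = points.shape[0]
    # Pairwise squared distances, shape (n, n).
    diff = points[:, None, :] - points[None, :, :]
    dist2 = np.sum(diff * diff, axis=-1)
    np.fill_diagonal(dist2, np.inf)  # exclude self-loops
    # Indices of the k nearest neighbors of each point.
    nbrs = np.argsort(dist2, axis=1)[:, :k]
    dst = np.repeat(np.arange(n), k)  # each point appears k times as target
    src = nbrs.reshape(-1)            # its k neighbors as sources
    return src, dst

# Toy scene: 32 random 3-D points, 4 neighbors each -> 128 directed edges.
pts = np.random.default_rng(0).random((32, 3))
src, dst = build_knn_graph(pts, k=4)
print(src.shape, dst.shape)  # (128,) (128,)
```

In the real pipeline such edge lists would be wrapped in a `dgl.graph((src, dst))` object; the multi-level design repeats this at scene, object, and grasp-point scales.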
## Requirements

- Python 3.7
- PyTorch 1.7.1 + CUDA 11.0.211
- DGL 0.7.2
- Open3D 0.14.1
- TensorBoard 2.8.0
- NumPy 1.21.2
- SciPy 1.7.3
- Pillow 8.4.0
- tqdm 4.62.3
## Installation

1. Clone our repository.

   ```bash
   git clone https://github.com/wang-h-w/GraNet.git
   cd GraNet
   ```

2. Install all dependencies mentioned in Requirements.

3. Compile and install the pointnet2 operators.

   ```bash
   cd pointnet2
   python setup.py install
   ```

4. Compile and install the knn operators.

   ```bash
   cd knn
   python setup.py install
   ```

5. Install GraspNetAPI.

   ```bash
   git clone https://github.com/graspnet/graspnetAPI.git
   cd graspnetAPI
   pip install .
   ```
## Data Preparation

### Download the GraspNet-1Billion dataset

GraNet is trained and tested on the GraspNet-1Billion dataset. You can download the dataset from here. You need to download the following categories from that page: Train Images, Test Images, 6 DoF Grasp Labels, and Object 3D Models. Place all downloaded files in the DATASET_PATH directory; this path must be provided later so that training and testing can find the data.
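After extraction, DATASET_PATH is typically expected to follow the standard graspnetAPI layout. The directory names below are the graspnetAPI conventions rather than something specified by this repository, so verify them against the files you actually downloaded:

```
DATASET_PATH/
├── scenes/       # Train Images + Test Images (one folder per scene)
├── models/       # Object 3D Models
└── grasp_label/  # 6 DoF Grasp Labels
```

The objectness_score folder described in the next step is generated into this same directory.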
### Generate labels for GPS network learning

In GraNet, the GPS network (see the paper for details) is optimized to identify high-value grasp points. For training, we need to provide the ground-truth grasping value of each point. Due to limited computing power, we currently sample points in the scene and generate a dataset of the grasping values of these points. Running the following commands will create the objectness_score folder in the DATASET_PATH directory, where the grasping value of each point is stored.

```bash
cd dataset
sh generate_score.sh
```

Note: `--points` sets the number of points for which grasping values are generated; `--start` and `--end` specify the first and last scene for label generation, and should be in the range 0-190 according to GraspNet-1Billion.
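Since the label generator scores only a sampled subset of scene points, it helps to see what such sampling looks like. A common choice for well-spread subsets of a point cloud is farthest point sampling; the sketch below is an illustration of that idea, not necessarily the sampler this repository uses:

```python
import numpy as np

def farthest_point_sampling(points, m):
    """Greedily pick m indices so the chosen points are well spread out."""
    n = points.shape[0]
    chosen = np.zeros(m, dtype=np.int64)
    dist = np.full(n, np.inf)  # distance from each point to the chosen set
    chosen[0] = 0              # start from an arbitrary point
    for i in range(1, m):
        # Update distances against the most recently chosen point,
        # then pick the point farthest from everything chosen so far.
        d = np.sum((points - points[chosen[i - 1]]) ** 2, axis=1)
        dist = np.minimum(dist, d)
        chosen[i] = int(np.argmax(dist))
    return chosen

# Sample 512 spread-out candidate points from a 2048-point toy scene.
pts = np.random.default_rng(1).random((2048, 3))
idx = farthest_point_sampling(pts, 512)
print(idx.shape)  # (512,)
```

In practice the repository's compiled pointnet2 operators provide a much faster CUDA implementation of this kind of sampling.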
## Usage

### Training

```bash
sh command_train.sh
```

### Testing

```bash
sh command_test.sh
```

### Demo

```bash
sh command_demo.sh
```
Note: the structure of the code in this repository has been reorganized for easier reading. During the experimental phase, we ran our tests on the raw, unorganized code, so the released checkpoints are only compatible with that version. You can download the raw code with pretrained weights from: [OneDrive] [Baidu Pan].
## Citation
If you find our work useful in your research, please consider citing:
```bibtex
@inproceedings{wang2023granet,
  title={GraNet: A Multi-Level Graph Network for 6-DoF Grasp Pose Generation in Cluttered Scenes},
  author={Wang, Haowen and Niu, Wanhao and Zhuang, Chungang},
  booktitle={2023 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)},
  pages={937--943},
  year={2023},
  organization={IEEE}
}
```
