Official repository for the paper "Uncertainty-Aware Suction Grasping for Cluttered Scenes".
Download the data and labels from the SuctionNet webpage.
The code has been tested with CUDA 11.6 and PyTorch 1.13 on Ubuntu 20.04.
Create a new environment:
conda create --name grasp python=3.8
Activate the environment and install PyTorch 1.13.1:
conda install pytorch==1.13.1 torchvision==0.14.1 torchaudio==0.13.1 pytorch-cuda=11.6 -c pytorch -c nvidia
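After the install finishes, a quick check can confirm that the expected PyTorch and CUDA versions are active (a minimal sketch, not part of the original setup steps):

```python
# Sanity check: verify the PyTorch and CUDA versions match the tested setup.
import torch

print(torch.__version__)          # expected: 1.13.1
print(torch.version.cuda)         # expected: 11.6
print(torch.cuda.is_available())  # should print True on a working CUDA setup
```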
Install Minkowski Engine:
git clone https://github.com/NVIDIA/MinkowskiEngine.git
cd MinkowskiEngine
python setup.py install --blas_include_dirs=${CONDA_PREFIX}/include --blas=openblas
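If the build succeeds, a small smoke test can confirm that MinkowskiEngine works with the installed PyTorch (a minimal sketch, assuming the MinkowskiEngine 0.5.x SparseTensor API):

```python
# MinkowskiEngine smoke test: build a tiny sparse tensor on the CPU.
import torch
import MinkowskiEngine as ME

print(ME.__version__)

# Coordinates are (batch_index, x, y, z); features are one 3-dim vector per point.
coordinates = torch.IntTensor([[0, 0, 0, 0],
                               [0, 1, 0, 0],
                               [0, 0, 1, 1]])
features = torch.rand(3, 3)
sparse_input = ME.SparseTensor(features=features, coordinates=coordinates)
print(sparse_input)
```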
Install prerequisites:
pip install -r requirements.txt
Install suctionnms and suctionnetAPI:
git clone https://github.com/intrepidChw/suctionnms.git
cd suctionnms
pip install .
git clone https://github.com/graspnet/suctionnetAPI
cd suctionnetAPI
pip install .
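A short import check can confirm that both packages installed correctly (a minimal sketch; the module names are assumed to match the repository names):

```python
# Confirm that the NMS extension and the dataset API are importable.
import suctionnms       # suction pose NMS (module name assumed from the repo name)
import suctionnetAPI    # SuctionNet dataset / evaluation API

print("suctionnetAPI imported from:", suctionnetAPI.__file__)
```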
- Precompute normal maps for the scenes (an illustrative sketch of this step follows the list):
cd dataset
python generate_normal_data.py --dataset_root '/path/to/SuctionNet/dataset'
- Precompute suction labels for the scenes:
cd dataset
python generate_suction_data.py --dataset_root '/path/to/SuctionNet/dataset'
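For reference, the normal-map precomputation conceptually amounts to estimating a per-point normal for each scene point cloud. The sketch below illustrates that step with Open3D using hypothetical file paths; it is only an illustration, not the repository's generate_normal_data.py:

```python
# Illustrative sketch: estimate per-point normals for one scene point cloud with Open3D.
# Paths and parameters are hypothetical; the actual precomputation is done by
# dataset/generate_normal_data.py over the whole SuctionNet dataset.
import numpy as np
import open3d as o3d

points = np.load("/path/to/one_scene_points.npy")  # (N, 3) scene points, hypothetical file

pcd = o3d.geometry.PointCloud()
pcd.points = o3d.utility.Vector3dVector(points)

# Hybrid KD-tree search: neighbors within 1 cm, at most 30 per point.
pcd.estimate_normals(
    search_param=o3d.geometry.KDTreeSearchParamHybrid(radius=0.01, max_nn=30))

# Orient normals toward the camera at the origin so they point out of the surface.
pcd.orient_normals_towards_camera_location(camera_location=np.zeros(3))

normals = np.asarray(pcd.normals)  # (N, 3) unit normals, ready to be cached to disk
np.save("/path/to/one_scene_normals.npy", normals)
```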
For training, use the following command:
bash scripts/train.sh
For evaluation, use the following command, where 'xxxx' denotes the split: 'seen', 'similar', or 'novel':
bash scripts/eval_xxxx.sh
The pre-trained models for the GraspNet dataset can be found here.
If you find our work useful, please cite:
@ARTICLE{USIN_grasp,
author={Cao, Rui and Yang, Biqi and Li, Yichuan and Fu, Chi-Wing and Heng, Pheng-Ann and Liu, Yun-Hui},
journal={IEEE Robotics and Automation Letters},
title={Uncertainty-Aware Suction Grasping for Cluttered Scenes},
year={2024},
volume={9},
number={6},
pages={4934-4941},
keywords={Grasping;Uncertainty;Point cloud compression;Robots;Noise measurement;Three-dimensional displays;Predictive models;Deep learning in grasping and manipulation;perception for grasping and manipulation;computer vision for automation},
doi={10.1109/LRA.2024.3385609}}
If you have any questions about this work, feel free to contact Rui Cao at [email protected].