<p align="center"> <h2 align="center">DexFuncGrasp: A Robotic Dexterous Functional Grasp Dataset Constructed from a Cost-Effective Real-Simulation Annotation System (AAAI2024)</h2> <p align="center">Jinglue Hang, Xiangbo Lin&dagger;, Tianqiang Zhu, Xuanheng Li, Rina Wu, Xiaohong Ma and Yi Sun;<br /> Dalian University of Technology<br /> &dagger; corresponding author<br /> <a href='https://hjlllll.github.io/DFG/'>project page</a> </p>

Contents

  1. Abstract

  2. Grasp pose collection

  3. Grasp Transfer for Dataset Extension

  4. DFG Dataset

  5. DexFuncGraspNet

  6. Simulation Experiment

  7. Acknowledgments

  8. Citation

  9. License

Abstract

<div align=center> <img src="pic/dataset.png" width="640px"> </div>

Robot grasp datasets are the basis for designing robot grasp generation models. Building a grasp dataset for a high-DOF dexterous robot hand is much harder than for a low-DOF gripper. Most existing datasets meet the needs of generating stable grasps, but they are not suitable for dexterous hands performing human-like functional grasps, such as grasping the handle of a cup or pressing the button of a flashlight, which would enable robots to complete subsequent functional manipulation autonomously; no dataset with functional grasp pose annotations existed before this work. This paper develops a unique Cost-Effective Real-Simulation Annotation System that leverages natural hand actions. The system captures functional grasps of a dexterous hand in a simulated environment, assisted by human demonstration in the real world. With this system, dexterous grasp data can be collected efficiently and cost-effectively. Finally, we construct the first dexterous functional grasp dataset with rich pose annotations. A Functional Grasp Synthesis Model is also provided to validate the effectiveness of the proposed system and dataset.

Download

To reproduce this work, download the following resources as needed (each is optional).

Environment

Three conda environments are used in this project.

Grasp pose collection

Cost-Effective Annotation System

<div align=center> <img src="pic/method-pipeline2.png" width="840px"> </div>
  • Our annotation system uses TeachNet to map the human hand to the ShadowHand and collect functional dexterous grasps. Collection scripts for other dexterous hands, which map joint angles directly from the ShadowHand, are also provided.

Hardware

  • Follow the RealSense website to install the RealSense SDK.
Two RGB cameras (our frame_shape = [720, 1280])
One RealSense depth camera (we use an Intel SR305)

Dependencies

  • Ubuntu 20.04 (optional)

  • Python 3.8

  • PyTorch 1.10.1

  • Numpy 1.22.0

  • mediapipe 0.8.11

  • pytorch-kinematics 0.3.0

  • Isaac Gym preview 4.0 (3.0)

  • CUDA 11.1

Common Packages

conda create -n annotate python==3.8.13
conda activate annotate

# Install pytorch with cuda
pip install torch==1.10.1 torchvision==0.11.2 ## or use the official command from the PyTorch website
pip install numpy==1.22.0
cd Annotation/
cd pytorch_kinematics/ # download from the link above
pip install -e .
cd ..
pip install -r requirement.txt

# Install IsaacGym : 
# download from up link and put in to folder Annotation/
cd IsaacGym/python/
pip install -e .
export LD_LIBRARY_PATH=/home/your/path/to/anaconda3/envs/annotate/lib
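Before running the collection scripts, it can help to confirm that the pinned packages are importable. The checker below is not part of the repository, just a small convenience sketch; the version pins are simply the ones listed above.

```python
import importlib
import importlib.util

# Versions the annotation pipeline is pinned to in the README.
EXPECTED = {
    "torch": "1.10.1",
    "numpy": "1.22.0",
    "mediapipe": "0.8.11",
}

def check_environment(expected=EXPECTED):
    """Report which pinned packages are importable and whether their
    versions match the pins. Missing packages are reported, not fatal."""
    report = {}
    for name, want in expected.items():
        spec = importlib.util.find_spec(name)
        if spec is None:
            report[name] = ("missing", want)
            continue
        mod = importlib.import_module(name)
        have = getattr(mod, "__version__", "unknown")
        report[name] = ("ok" if have == want else f"got {have}", want)
    return report

if __name__ == "__main__":
    for pkg, (status, want) in check_environment().items():
        print(f"{pkg:<12} want {want:<8} -> {status}")
```

Run it inside the `annotate` environment; a mismatch is not necessarily fatal, but the pinned versions are the tested ones.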

Process steps

  • Download Isaac Gym preview 4.0 (3.0)

    |-- Annotation
        |-- IsaacGym
    
  • Download Obj_Data

    |-- Annotation
        |-- IsaacGym
            |-- assets
                |-- urdf
                    |-- off
                        |-- Obj_Data
                        |-- Obj_Data_urdf
    
  • Set up the cameras in the real world as shown in the figure.

  • Follow the instructions from handpose3d to obtain the camera_parameters folder, or use the provided one.

  • Create a folder, for example, named /Grasp_Pose.

  • Run the collection script below, where --idx is the category id, --instance selects the object to be grasped, and --cam_1 / --cam_2 are the camera ids:

python shadow_dataset_human_shadow_add_issacgym_system_pytorch3d_mesh_new_dataset.py --idx 0 --instance 0 --cam_1 6 --cam_2 4

Verify with Isaac Gym at the same time (open another terminal in parallel).

  • The grasp pose files are read from Grasp_Pose/ and sent to Isaac Gym for verification; successful grasps and the collected success rate are saved under /Tink_Grasp_Transfer/Dataset/Grasps/.
cd ..
cd IsaacGym/python
python grasp_gym_runtime_white_new_data.py --pipeline cpu --grasp_mode dynamic --idx 0 --instance 0
  • If you judge a grasp to be good, press the space bar to save its pose; try to collect fewer than 30 grasps, then click the x in the top right of the Isaac Gym window to close it. The grasp poses are saved in Grasp_Pose/.

  • After collection, unify the axes of the grasps in /Tink_Grasp_Transfer/Dataset/Grasps/ so that an SDF function can be learned for each category.

python trans_unit.py 
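trans_unit.py itself is not reproduced here. As an illustrative sketch (not the repository's actual code) of the kind of canonicalization this step performs, the snippet below expresses a hand position in the object's frame, i.e. the frame in which the object quaternion becomes [1, 0, 0, 0], assuming poses are stored as a position plus a wxyz quaternion:

```python
import numpy as np

def quat_to_mat(q):
    """Convert a unit quaternion [w, x, y, z] to a 3x3 rotation matrix."""
    w, x, y, z = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])

def to_object_frame(hand_pos, obj_pos, obj_quat):
    """Express a hand position in the object's canonical frame, so that
    every grasp of a category shares the same object axes."""
    R = quat_to_mat(obj_quat)
    # Undo the object's world pose: translate, then rotate by R^T.
    return R.T @ (np.asarray(hand_pos, dtype=float) - np.asarray(obj_pos, dtype=float))
```

The hand orientation and joint angles would be transformed analogously; the actual file layout and pose encoding are defined by the repository's scripts.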
  • Other dexterous hand collection demo (Optional)
python shadow_dataset_human_shadow_add_issacgym_system_pytorch3d_mesh_new_dataset_multi_dexterous.py --idx 0 --instance 0 --cam_1 6 --cam_2 4
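The multi-dexterous demo maps joint angles directly from the ShadowHand to other hands. A minimal sketch of such direct angle mapping, assuming per-joint limits are known for both hands (this is not the repository's implementation):

```python
import numpy as np

def retarget_joints(src_angles, src_limits, dst_limits):
    """Map joint angles from one hand to another by normalising each
    angle within its source joint range and rescaling to the target
    range -- the simple direct-angle-mapping style of transfer.

    src_limits / dst_limits are (lower, upper) arrays, one entry per joint.
    """
    src_angles = np.asarray(src_angles, dtype=float)
    lo_s, hi_s = (np.asarray(a, dtype=float) for a in src_limits)
    lo_d, hi_d = (np.asarray(a, dtype=float) for a in dst_limits)
    span = np.where(hi_s - lo_s == 0, 1.0, hi_s - lo_s)  # avoid /0 on fixed joints
    t = np.clip((src_angles - lo_s) / span, 0.0, 1.0)
    return lo_d + t * (hi_d - lo_d)
```

Per-joint correspondences between hands with different kinematics need a hand-crafted mapping table on top of this; the rescaling above only handles differing joint ranges.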
  • Visualization
# put the .pkl files into Annotation/visual_dict/new/
python show_data_mesh.py
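Before visualizing, it can be handy to see what a collected .pkl actually contains. The inspector below is a hypothetical helper, not part of the repository; the real key names depend on the collection script:

```python
import pickle

def inspect_grasp_pkl(path):
    """Summarize the top-level structure of a collected grasp .pkl file:
    for a dict, map each key to its value's type name."""
    with open(path, "rb") as f:
        data = pickle.load(f)
    if isinstance(data, dict):
        return {k: type(v).__name__ for k, v in data.items()}
    return {type(data).__name__: None}
```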

Grasp Transfer for Dataset Extension

<div align=center> <img src="pic/method-transfer.png" width="740px"> </div>

Dependencies

  • Tink: this part is modified from Tink (OakInk).

  • git clone https://github.com/oakink/DeepSDF_OakInk, then follow its instructions and install all requirements. The code is in C++ and depends on the following (use the same conda env, annotate):

  • CLI11

  • Pangolin

  • nanoflann

  • Eigen3.3.9

Common Packages

pip install termcolor
pip install plyfile

### prepare mesh-to-sdf env
git clone https://github.com/marian42/mesh_to_sdf
cd mesh_to_sdf
pip install -e .


pip install scikit-image==0.16.2

# put the downloaded packages in Transfer/third-party/
cd CLI11 # repeat for Pangolin, nanoflann, Eigen3
mkdir build
cd build
cmake ..
make -j8 

Process steps

  • The same process as used in Tink.
cd Tink_Grasp_Transfer/
python generate_sdf.py --idx 0
python train_deep_sdf.py --idx 0
python reconstruct_train.py --idx 0 --mesh_include
python tink/gen_interpolate.py --all --idx 0
python tink/cal_contact_info_shadow.py --idx 0 --tag trans
python tink/info_transform.py --idx 0 --all
python tink/pose_refine.py --idx 0 --all #--vis
  • Or run the whole pipeline with one script:
sh transfer.sh
  • Keep only the successful grasps and unify the dataset axes:
cd ../../../IsaacGym/python/collect_grasp/
# save the success grasp
sh run_clean.sh
# unit axis of dataset
python trans_unit_dataset_func.py

  • You can also convert the grasp files to reduce their size:
cd DexFuncGraspNet/Grasps_Dataset
python data_process_m.py
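The internals of data_process_m.py are not documented here; one plausible way to shrink grasp files is to downcast float64 arrays to float32, which halves the file size with negligible precision loss for poses. This is an illustrative sketch, not the script's actual logic:

```python
import numpy as np

def compress_grasps(grasps):
    """Downcast a float64 grasp array to float32 before saving with
    np.save. Grasp poses (position + quaternion + joint angles) rarely
    need double precision, so this halves the on-disk size."""
    arr = np.asarray(grasps)
    if arr.dtype == np.float64:
        arr = arr.astype(np.float32)
    return arr
```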
  • At this point, the grasp dataset sits in Annotation/Tink_Grasp_Transfer/Dataset/Grasps; the grasps used for training are under /0_unit_025_mug/sift/unit_mug_s009/new, where all object quaternions are [1 0 0 0], i.e. every object shares the same canonical axes.

DFG Dataset

<div align=center> <img src="pic/DexFuncGrasp.png" width="640px"> </div>
  • We collect objects from online datasets such as OakInk and collect grasps through the steps above; we name the result the DFG dataset.
  • Download source meshes and grasp labels for 12 categories from the DFG-Dataset.
  • Arrange the files as follows:
|-- DexFuncGraspNet
    |-- Grasps_Dataset
        |-- train
            |-- 0_unit_025_mug ##labeled objects
                |--unit_mug_s009.npy ##transferred objects
              
