# RDF
Learning Robot Geometry as Distance Fields: Applications to Whole-body Manipulation
Code for the paper "Learning Robot Geometry as Distance Fields: Applications to Whole-body Manipulation".
<img src='robot_sdf.gif'/>

## Dependencies
- Python version: 3.8 (tested)
- PyTorch version: 1.13.0 (tested)
- Install the necessary packages:

  ```
  pip install -r requirements.txt
  ```

- Install Chamfer Distance (optional, only used for evaluating the Chamfer distance)
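The optional Chamfer distance metric can be sketched in a few lines. This is a naive O(N·M) numpy version for illustration, not the optimized package installed above:

```python
import numpy as np

def chamfer_distance(a, b):
    """Naive symmetric Chamfer distance between point sets a (N, 3) and b (M, 3)."""
    # Pairwise squared distances between every point in a and every point in b.
    d2 = np.sum((a[:, None, :] - b[None, :, :]) ** 2, axis=-1)   # (N, M)
    # Average nearest-neighbor distance in both directions.
    return d2.min(axis=1).mean() + d2.min(axis=0).mean()

a = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
b = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
print(chamfer_distance(a, b))   # 0.0 for identical point sets
```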
## Usage

### Run RDF represented with basis functions

```
python bf_sdf.py --n_func 24 --device cuda
```
Given points of size (N,3) and joint configurations of size (B,7), it outputs SDF values (B,N) and gradients w.r.t. both points (analytical, with shape (B,N,3)) and joints (numerical, with shape (B,N,7)).
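As a toy illustration of this (N,3) points, (B,7) configurations, (B,N) values interface, the sketch below uses an analytic sphere SDF in place of the learned model; `toy_sdf_batch` is a hypothetical stand-in, not part of this repository:

```python
import numpy as np

def toy_sdf_batch(points, thetas):
    """Stand-in for the learned RDF query: returns SDF values and point gradients.

    Each 'configuration' here just shifts a unit sphere along x, so only the
    shape contract matches the real model: points (N, 3), thetas (B, 7).
    """
    centers = np.zeros((thetas.shape[0], 3))
    centers[:, 0] = thetas[:, 0]                       # toy dependence on joint 1
    diff = points[None, :, :] - centers[:, None, :]    # (B, N, 3)
    dist = np.linalg.norm(diff, axis=-1)               # (B, N)
    sdf = dist - 1.0                                   # signed distance to unit sphere
    grad_points = diff / np.clip(dist[..., None], 1e-9, None)   # analytical, (B, N, 3)
    return sdf, grad_points

points = np.random.randn(5, 3)     # N = 5 query points
thetas = np.zeros((2, 7))          # B = 2 joint configurations
sdf, grad = toy_sdf_batch(points, thetas)
print(sdf.shape, grad.shape)       # (2, 5) (2, 5, 3)
```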
You can also uncomment the code

```python
# # visualize the Bernstein Polynomial model for each robot link
# bp_sdf.create_surface_mesh(model, nbData=128, vis=True)

# # visualize the Bernstein Polynomial model for the whole body
# theta = torch.tensor([0, -0.3, 0, -2.2, 0, 2.0, np.pi/4]).float().to(args.device).reshape(-1, 7)
# pose = torch.from_numpy(np.identity(4)).to(args.device).reshape(-1, 4, 4).expand(len(theta), 4, 4).float()
# trans_list = panda.get_transformations_each_link(pose, theta)
# utils.visualize_reconstructed_whole_body(model, trans_list, tag=f'BP_{args.n_func}')
```

to visualize the reconstructed meshes for each robot link and for the whole body, respectively.
We provide some pretrained models in the `models` folder.
### Visualize the SDF

We provide 2D and 3D visualizations of the produced SDF and its gradient. Just run

```
python vis.py
```

to see the results.
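The idea behind a 2D visualization is to evaluate the field on a planar slice. The sketch below does this for a toy sphere SDF standing in for the learned model (the plotting part is commented out so the snippet only needs numpy):

```python
import numpy as np

# Evaluate a toy SDF (unit sphere, standing in for the learned robot SDF)
# on the 2D slice z = 0, the same kind of grid a 2D visualization uses.
xs = np.linspace(-2.0, 2.0, 200)
ys = np.linspace(-2.0, 2.0, 200)
X, Y = np.meshgrid(xs, ys)
pts = np.stack([X, Y, np.zeros_like(X)], axis=-1)   # (200, 200, 3) query points
sdf = np.linalg.norm(pts, axis=-1) - 1.0            # signed-distance slice

print(sdf.shape)   # (200, 200)
# To display (requires matplotlib):
# import matplotlib.pyplot as plt
# plt.contourf(X, Y, sdf, levels=30); plt.colorbar()
# plt.contour(X, Y, sdf, levels=[0.0])   # zero level set = surface
# plt.show()
```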
### Train the RDF model with basis functions

Generate SDF data for each robot link:

```
python sample_sdf_points.py
```

It computes the SDF values based on mesh_to_sdf. The sampled points and SDF values are saved in data/sdf_points. You can also download the data here. After this, please put the '*.npy' files in data/sdf_points/.
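The kind of sampling such a script performs can be sketched as a mix of near-surface points and uniform box samples. Here an analytic sphere SDF stands in for mesh_to_sdf on a link mesh; the names and the 0.05 noise scale are illustrative assumptions, not the script's actual parameters:

```python
import numpy as np

rng = np.random.default_rng(0)

def sphere_sdf(points, radius=1.0):
    """Analytic sphere SDF, standing in for mesh_to_sdf on a link mesh."""
    return np.linalg.norm(points, axis=-1) - radius

# Near-surface samples: surface points plus small Gaussian noise, so the
# fit sees many labels close to the zero level set.
on_surface = rng.normal(size=(5000, 3))
on_surface /= np.linalg.norm(on_surface, axis=-1, keepdims=True)
near_surface = on_surface + 0.05 * rng.normal(size=(5000, 3))

# Uniform samples in a bounding box, so the field is also supervised far away.
in_box = rng.uniform(-1.5, 1.5, size=(5000, 3))

points = np.concatenate([near_surface, in_box], axis=0)   # (10000, 3)
sdf = sphere_sdf(points)                                  # (10000,)
print(points.shape, sdf.shape)
```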
Then just run

```
python bf_sdf.py --train --n_func 8 --device cuda
```

to learn the weights of the basis functions. It normally takes 1-2 minutes to train a model when using 8 basis functions.
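The core idea, representing a distance field as a weighted sum of Bernstein polynomial basis functions whose weights are fit by least squares, can be sketched in 1D. This is a simplified illustration, not the repository's training code:

```python
import numpy as np
from math import comb

def bernstein_basis(x, n):
    """Values of the n Bernstein basis polynomials of degree n-1 at x in [0, 1]."""
    i = np.arange(n)
    coeff = np.array([comb(n - 1, k) for k in range(n)], dtype=float)
    return coeff * x[:, None] ** i * (1.0 - x[:, None]) ** (n - 1 - i)   # (len(x), n)

# Toy 1D "SDF": signed distance to the interval [0.4, 0.6] on [0, 1].
x = np.linspace(0.0, 1.0, 200)
sdf = np.maximum(0.4 - x, x - 0.6)

# Least-squares fit of the basis weights, the same idea as learning RDF weights.
n_func = 8
B = bernstein_basis(x, n_func)                  # (200, 8) design matrix
w, *_ = np.linalg.lstsq(B, sdf, rcond=None)     # (8,) learned weights

approx = B @ w
print(np.abs(approx - sdf).max())               # small reconstruction error
```

More basis functions sharpen the fit near the kink at the surface, which is why `--n_func 24` gives a more accurate (but larger and slower) model than `--n_func 8`.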
## Evaluation

For instance, you can run

```
# method: [BP_8, BP_24, NN_LD, NN_AD, Sphere]
# type: [RDF, LINK]
python eval.py --device cuda --method BP_8 --type RDF
```

to evaluate the quality of RDF. BP_8 and BP_24 denote the number of basis functions used, while NN_LD and NN_AD denote neural network models trained with limited data and augmented data, respectively.
## Dual-arm grasp planning

You can run

```
python bbo_planning.py
```

to see how our RDF model can be used for a whole-arm lifting task with the Gauss-Newton algorithm. It plans 3 valid joint configurations for both arms.
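The Gauss-Newton step used in this kind of planning can be sketched on a toy problem: drive a point onto the zero level set of an SDF, using the SDF value as the residual and its gradient as the Jacobian. This is a simplified illustration, not the repository's planner:

```python
import numpy as np

def sphere_sdf(p, radius=1.0):
    """Toy SDF (unit sphere) standing in for the learned robot distance field."""
    d = np.linalg.norm(p)
    return d - radius, p / max(d, 1e-9)   # value and analytical gradient

p = np.array([2.0, 1.0, 0.5])             # initial point, off the surface
for _ in range(10):
    r, J = sphere_sdf(p)                  # residual (scalar) and Jacobian (1x3)
    # Gauss-Newton update: delta = -J^T (J J^T)^{-1} r; here J J^T = |J|^2.
    p = p - J * r / np.dot(J, J)

r, _ = sphere_sdf(p)
print(abs(r))   # residual driven to ~0: p now lies on the surface
```

In the actual planner the residual vector also stacks task constraints (e.g. contact and lifting objectives) and the Jacobian is taken w.r.t. the joint angles, but the update rule has the same form.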
## Train RDF for your own robot

- Build a differentiable robot layer for forward kinematics (see panda_layer/panda_layer.py for details)
- Train the RDF model using basis functions (we use the .stl files for SDF computation and reconstruction, which can be found in the URDF file)
- Use it!
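The first step, a differentiable forward-kinematics layer, can be sketched for a toy 2-link planar arm by composing homogeneous transforms. This is a minimal illustration of the structure (the per-link transform list mirrors what a function like `get_transformations_each_link` returns), not the Panda layer itself:

```python
import numpy as np

def rot_z(q):
    """Homogeneous transform: rotation about z by angle q."""
    c, s = np.cos(q), np.sin(q)
    return np.array([[c, -s, 0, 0], [s, c, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]])

def trans_x(l):
    """Homogeneous transform: translation along x by length l."""
    T = np.eye(4)
    T[0, 3] = l
    return T

def fk_two_link(q, l1=1.0, l2=1.0):
    """Per-link world transforms of a planar 2-link arm (toy analogue of the
    per-link transform list used when reconstructing the whole body)."""
    T1 = rot_z(q[0]) @ trans_x(l1)
    T2 = T1 @ rot_z(q[1]) @ trans_x(l2)
    return [T1, T2]

T1, T2 = fk_two_link(np.array([np.pi / 2, -np.pi / 2]))
print(np.round(T2[:3, 3], 6))   # end-effector position
```

In the real layer these operations are written in PyTorch so that gradients w.r.t. the joint angles flow through the transforms.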
Note: another option is to use the pytorch_kinematics library to parse the URDF file automatically and build RDF for your own robot: https://github.com/UM-ARM-Lab/pytorch_kinematics. An example is available at panda_layer/panda_layer_pk.py.
## Citation

If you find this work useful in your research, please cite:

```
@inproceedings{li2024representing,
  author={Li, Yiming and Zhang, Yan and Razmjoo, Amirreza and Calinon, Sylvain},
  title={Representing Robot Geometry as Distance Fields: Applications to Whole-body Manipulation},
  booktitle={Proc.\ {IEEE} Intl Conf.\ on Robotics and Automation ({ICRA})},
  year={2024},
  pages={15351--15357}
}
```
RDF is maintained by Yiming LI and licensed under the MIT License.
The URDF file we use is licensed under the Apache License.

Copyright (c) 2023 Idiap Research Institute <contact@idiap.ch>