
Spider

A general physics-based retargeting framework.


<div align="center">

🕸️ SPIDER: Scalable Physics-Informed DExterous Retargeting

<p align="center"> <a href="https://creativecommons.org/licenses/by-nc/4.0/"> <img src="https://img.shields.io/badge/License-CC_BY--NC_4.0-lightgrey.svg" alt="License: CC BY-NC 4.0"> </a> <a href="https://www.python.org/downloads/"> <img src="https://img.shields.io/badge/python-3.12+-blue.svg" alt="Python 3.12+"> </a> <a href="https://pytorch.org/"> <img src="https://img.shields.io/badge/PyTorch-2.0+-ee4c2c.svg" alt="PyTorch"> </a> <a href="https://arxiv.org/abs/2511.09484"> <img src="https://img.shields.io/badge/arXiv-2511.09484-b31b1b.svg" alt="arXiv"> </a> </p> <p align="center">

<a href="https://jc-bao.github.io/spider-project/"><b>Project Website</b></a> · <a href="https://facebookresearch.github.io/spider/"><b>Documentation</b></a> · <a href="https://huggingface.co/datasets/retarget/retarget_example"><b>Dataset</b></a>

</p>


</div>

Overview

Scalable Physics-Informed DExterous Retargeting (SPIDER) is a general framework for physics-based retargeting from humans to diverse robot embodiments, including both dexterous hands and humanoid robots. It is designed to be a minimal, flexible, and extendable framework for human-to-robot retargeting. This codebase provides the following pipeline from human video to robot actions:

*(Pipeline overview figure: from human video to robot actions.)*
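At a high level, the pipeline chains dataset processing, scene generation, kinematic retargeting, and physics-based retargeting. A minimal sketch of the stage ordering, with hypothetical function names that mirror the scripts listed in the Workflow section (this is not SPIDER's actual API):

```python
# Hypothetical sketch of the SPIDER pipeline stages; function names are
# illustrative placeholders mirroring the preprocessing scripts below.

def process_dataset(task):        # extract human hand/body motion from raw data
    return {"task": task, "stage": "processed"}

def decompose_objects(trial):     # convex-decompose object meshes for simulation
    trial["stage"] = "decomposed"; return trial

def generate_scene(trial):        # emit a simulation scene for robot + objects
    trial["stage"] = "scene"; return trial

def kinematic_retarget(trial):    # IK: map human keypoints to robot joints
    trial["stage"] = "kinematic"; return trial

def physics_retarget(trial):      # physics-based trajectory optimization
    trial["stage"] = "retargeted"; return trial

def run_pipeline(task):
    trial = process_dataset(task)
    for stage in (decompose_objects, generate_scene,
                  kinematic_retarget, physics_retarget):
        trial = stage(trial)
    return trial

print(run_pipeline("p36-tea")["stage"])  # retargeted
```

Each stage corresponds to one of the scripts invoked in the Native Mujoco Warp workflow below.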

Gallery

Simulation results:

*(Video grid)* Inspire Pick Tea Pot (GigaHands dataset), Xhand Play Glass (HOT3D dataset), Schunk Pick Board (OakInk dataset), Allegro Pick Cat Toy (reconstructed from a single RGB video).

*(Video grid)* G1 Pick, G1 Run, H1 Kick, T1 Skip.

Multiple viewer support:

*(Video grid)* Mujoco viewer, Rerun viewer.

Multiple simulators support:

*(Video grid)* Genesis, Mujoco Warp, IsaacGym.

Deployment to real-world robots:

*(Video grid)* Pick Cup, Rotate Bulb, Unplug Charger, Pick Duck.

Features

  • First general physics-based retargeting pipeline for both dexterous hands and humanoid robots.
  • Supports 9+ robots and 6+ datasets out of the box.
  • Seamless integration with RL training and data augmentation for behavior cloning (BC) pipelines.
  • Native support for multiple simulators (Mujoco Warp, Genesis) and multiple downstream training pipelines (HDMI, DexMachina).
  • Sim2real ready.

Quickstart

Clone example datasets:

```bash
sudo apt install git-lfs
git lfs install
git clone https://huggingface.co/datasets/retarget/retarget_example example_datasets
```

(Option 1) Quickstart with uv:

Create env and install (make sure uv uses Python 3.12, which is what the project targets):

```bash
uv sync
```

If you already have the example datasets cloned, you can skip the preprocessing steps that convert the human data to robot kinematic trajectories, since the example trials are already processed. Run SPIDER on a processed trial:

Set the environment variables for exactly one of the example datasets, then run the retargeting:

```bash
# for the gigahand dataset
export TASK=p36-tea
export HAND_TYPE=bimanual
export DATA_ID=0
export ROBOT_TYPE=xhand
export DATASET_NAME=gigahand

# or, for the oakink dataset
export TASK=lift_board
export HAND_TYPE=bimanual
export DATA_ID=0
export ROBOT_TYPE=xhand
export DATASET_NAME=oakink

# run retargeting
uv run examples/run_mjwp.py \
  +override=${DATASET_NAME} \
  task=${TASK} \
  data_id=${DATA_ID} \
  robot_type=${ROBOT_TYPE} \
  embodiment_type=${HAND_TYPE}
```

For the full workflow, please refer to the Workflow section.
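The `+override=${DATASET_NAME}` argument in the command above appears to follow Hydra-style config composition: a base config, a named override group, then explicit `key=value` arguments that win last. A minimal plain-Python sketch of that layering, with illustrative keys that are not SPIDER's real configuration schema:

```python
# Sketch of Hydra-style config layering; keys/values are illustrative only.

base = {"robot_type": "xhand", "data_id": 0, "viewer": "mujoco"}

# named override groups, selected via +override=<name>
dataset_overrides = {"gigahand": {"task": "p36-tea", "embodiment_type": "bimanual"}}

def compose(base, override_name, **cli):
    cfg = dict(base)                           # start from the base config
    cfg.update(dataset_overrides[override_name])  # apply the +override group
    cfg.update(cli)                            # CLI key=value pairs win last
    return cfg

cfg = compose(base, "gigahand", robot_type="allegro")
print(cfg["robot_type"])  # allegro
```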

(Option 2) Quickstart with conda:

```bash
conda create -n spider python=3.12
conda activate spider
python -m pip install --upgrade pip
python -m pip install -r requirements.txt
python -m pip install --no-deps -e .
```

Run MJWP on a processed trial:

```bash
python examples/run_mjwp.py
```

Workflow

SPIDER is designed to support multiple workflows depending on your simulator of choice and downstream tasks.

  • Native Mujoco Warp (MJWP) is the default workflow and supports both dexterous hand and humanoid robot retargeting.
  • We also support the Genesis simulator through DexMachina; this workflow is useful for further training a dexterous-hand policy with RL.
  • The HDMI workflow supports humanoid robot retargeting plus an RL workflow for humanoid-object interaction tasks. It uses MjLab as its backend simulator.
  • The ManipTrans workflow supports dexterous hand retargeting with IsaacGym.

Native Mujoco Warp Workflow

Please refer to the Native Mujoco Warp workflow for details.

  • supports dexterous hand and humanoid robot retargeting

```bash
TASK=p36-tea
HAND_TYPE=bimanual
DATA_ID=0
ROBOT_TYPE=xhand
DATASET_NAME=gigahand

# put your raw data under raw/{dataset_name}/ in your dataset folder

# read data from a self-collected dataset
uv run spider/process_datasets/gigahand.py --task=${TASK} --embodiment-type=${HAND_TYPE} --data-id=${DATA_ID}

# decompose objects
# here we use the fast decomposition pipeline with mink
# you can also use decompose.py for the original pipeline with CoACD, which gives higher-quality decompositions
uv run spider/preprocess/decompose_fast.py --task=${TASK} --dataset-name=${DATASET_NAME} --data-id=${DATA_ID} --embodiment-type=${HAND_TYPE}

# detect contacts (optional)
uv run spider/preprocess/detect_contact.py --task=${TASK} --dataset-name=${DATASET_NAME} --data-id=${DATA_ID} --embodiment-type=${HAND_TYPE}

# generate the scene
uv run spider/preprocess/generate_xml.py --task=${TASK} --dataset-name=${DATASET_NAME} --data-id=${DATA_ID} --embodiment-type=${HAND_TYPE} --robot-type=${ROBOT_TYPE}

# kinematic retargeting
# here we use the fast IK pipeline with mink
# you can also use ik.py for the original IK pipeline with mujoco (used in the paper)
uv run spider/preprocess/ik_fast.py --task=${TASK} --dataset-name=${DATASET_NAME} --data-id=${DATA_ID} --embodiment-type=${HAND_TYPE} --robot-type=${ROBOT_TYPE}

# retargeting
# here we use the fast retargeting pipeline
# the original paper uses +override=${DATASET_NAME}, which runs a bit slower
uv run examples/run_mjwp.py +override=${DATASET_NAME}_fast task=${TASK} data_id=${DATA_ID} robot_type=${ROBOT_TYPE} embodiment_type=${HAND_TYPE}

# read data for deployment (optional)
uv run spider/postprocess/read_to_robot.py --task=${TASK} --dataset-name=${DATASET_NAME} --data-id=${DATA_ID} --robot-type=${ROBOT_TYPE} --embodiment-type=${HAND_TYPE}
```
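The final `read_to_robot.py` step produces data for deployment, but its output format is not documented here. Purely for illustration, here is a hypothetical deployment-side reader assuming a JSON file of per-frame joint-position targets (the field names `t` and `qpos` are assumptions, not SPIDER's real schema):

```python
# Hypothetical reader for deployment data; the JSON layout is assumed.
import json

def load_trajectory(path):
    """Load frames of the form [{"t": float, "qpos": [...]}, ...]."""
    with open(path) as f:
        frames = json.load(f)
    # sanity-check that every frame has the same number of joints
    for frame in frames:
        assert len(frame["qpos"]) == len(frames[0]["qpos"]), "inconsistent DOF"
    return frames

def stream_targets(frames):
    """Yield (timestamp, joint targets) in time order for a robot controller."""
    for frame in sorted(frames, key=lambda fr: fr["t"]):
        yield frame["t"], frame["qpos"]
```

A real controller loop would consume `stream_targets` at the robot's control rate; consult the actual output of `read_to_robot.py` for the true format.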

DexMachina Workflow

Please refer to DexMachina workflow for details.

```bash
# install the dexmachina conda environment following their official instructions: https://mandizhao.github.io/dexmachina-docs/0_install.html
conda activate dexmachina
# note: install only spider, without mujoco warp, since we only use the optimization part
pip install --ignore-requires-python --no-deps -e .
# run retargeting
python examples/run_dexmachina.py
```

HDMI Workflow

Please refer to HDMI workflow for details.

```bash
# install the HDMI uv environment following their official instructions:
# go to the hdmi folder, then install SPIDER with
uv pip install --no-deps -e ../spider
```

ManipTrans Workflow

Please refer to ManipTrans workflow for details.

```bash
# install the maniptrans conda environment following their official instructions: https://github.com/ManipTrans/ManipTrans
conda activate maniptrans
# note: install only spider, without mujoco warp, since we only use the optimization part
pip install --ignore-requires-python --no-deps -e .
# run retargeting
python examples/run_maniptrans.py
```

Remote Development

```bash
# start the rerun server
uv run rerun --serve-web --port 9876

# run SPIDER with the rerun viewer only
uv run examples/run_mjwp.py viewer="rerun"
```

License

SPIDER is released under the Creative Commons Attribution-NonCommercial 4.0 license. See LICENSE for details.

Code of Conduct

We expect everyone to follow the Contributor Covenant Code of Conduct in CODE_OF_CONDUCT.md when participating in this project.

Acknowledgments

  • Thanks to Mandi Zhao for help with the DexMachina workflow for SPIDER + Genesis.
  • Thanks to Taylor Howell for help in the early stages of integrating Mujoco Warp for SPIDER + MJWP.
  • Thanks to Haoyang Weng for help with the HDMI workflow for SPIDER + sim2real RL.
  • Inverse kinematics desi
