
UniRig: One Model to Rig Them All

<div align="center">

Project Page Paper Model

</div>

(teaser image)

This repository contains the official implementation for the SIGGRAPH'25 (TOG) UniRig framework, a unified solution for automatic 3D model rigging, developed by Tsinghua University and Tripo.

Paper: One Model to Rig Them All: Diverse Skeleton Rigging with UniRig

Overview

Rigging 3D models – creating a skeleton and assigning skinning weights – is a crucial but often complex and time-consuming step in 3D animation. UniRig tackles this challenge by introducing a novel, unified framework leveraging large autoregressive models to automate the process for a diverse range of 3D assets.

Combining UniRig with keyframe animation produces the following results:

| devil | dragon | rabbit |
|:-----------------------------:|:-------------------------------:|:-------------------------------:|

The full UniRig system consists of two main stages:

  1. Skeleton Prediction: A GPT-like transformer autoregressively predicts a topologically valid skeleton hierarchy using a novel Skeleton Tree Tokenization scheme.
  2. Skinning Weight & Attribute Prediction: A Bone-Point Cross Attention mechanism predicts per-vertex skinning weights and relevant bone attributes (e.g., for physics simulation) based on the predicted skeleton and input mesh geometry.
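
The tokenization in stage 1 can be illustrated with a small sketch. The scheme below is a hypothetical simplification, not the paper's exact vocabulary: it serializes a joint tree depth-first, emitting one position token per joint and bracket tokens around each group of children.

```python
# Hypothetical sketch in the spirit of Skeleton Tree Tokenization
# (the paper's actual token vocabulary and ordering may differ).
def tokenize_skeleton(parents, coords):
    """parents[i] is the parent joint index (None for the root);
    coords[i] is a discretized (x, y, z) position for joint i."""
    children = {}
    for i, p in enumerate(parents):
        children.setdefault(p, []).append(i)

    tokens = []

    def visit(j):
        tokens.append(("JOINT", coords[j]))       # one token per joint
        kids = children.get(j, [])
        if kids:
            tokens.append(("BRANCH_DOWN",))       # open a child group
            for k in kids:
                visit(k)
            tokens.append(("BRANCH_UP",))         # close the child group
    visit(children[None][0])                      # start at the root joint
    return tokens
```

In an autoregressive model, such a sequence lets the transformer emit one joint at a time while the bracket tokens keep the predicted hierarchy topologically valid.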

This repository provides the code implementation for the entire framework, with components being released progressively.

Key Features (Full UniRig Framework)

  • Unified Model: Aims to handle diverse model categories (humans, animals, objects) with a single framework.
  • Automated Skeleton Generation: Predicts topologically valid skeleton structures. (✅ Available in current release)
  • Automated Skinning Prediction: Predicts per-vertex skinning weights. (✅ Available in current release)
  • Bone Attribute Prediction: Predicts attributes like stiffness for physics-based secondary motion. (⏳ Coming Soon)
  • High Accuracy & Robustness: Achieves state-of-the-art results on challenging datasets (as shown in the paper with Rig-XL/VRoid training).
  • Efficient Tokenization: Uses Skeleton Tree Tokenization for compact representation and efficient processing.
  • Human-in-the-Loop Ready: Designed to potentially support iterative refinement workflows.

🚨 Current Release Status & Roadmap 🚨

We are open-sourcing UniRig progressively. Please note the current status:

Available Now (Initial Release):

  • Code: Implementation for skeleton and skinning prediction.
  • Model: Skeleton & Skinning Prediction checkpoint trained on Articulation-XL2.0. Available on Hugging Face.
  • Dataset: Release of the Rig-XL and VRoid datasets used in the paper. We also filtered out 31 broken models from the training dataset; removing them does not affect the performance of the final model.
  • ✅ Training code.

Planned Future Releases:

  • ⏳ Full UniRig model checkpoints (Skeleton + Skinning) trained on Rig-XL/VRoid, replicating the paper's main results.

We appreciate your patience as we prepare these components for release. Follow VAST-AI-Research announcements for updates!

Installation

  1. Prerequisites:

    • Python 3.11
    • PyTorch (tested with version >=2.3.1)
  2. Clone the repository:

    git clone https://github.com/VAST-AI-Research/UniRig
    cd UniRig
    
  3. Set up a virtual environment (recommended):

    conda create -n UniRig python=3.11
    conda activate UniRig
    
  4. Install dependencies:

    python -m pip install torch torchvision
    python -m pip install -r requirements.txt
    python -m pip install spconv-{your-cuda-version}
    python -m pip install torch_scatter torch_cluster -f https://data.pyg.org/whl/torch-{your-torch-version}+{your-cuda-version}.html --no-cache-dir
    python -m pip install numpy==1.26.4
    

spconv is installed from its own repository, while torch_scatter and torch_cluster are installed from the PyG wheel index. There is also a high chance that you will encounter a flash_attn installation error; if so, go to its original repository and follow its installation guide.
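
The bracketed placeholders in the pip commands above expand to concrete version tags. As a sketch of that substitution (the URL pattern is copied from the command above; the version strings are examples only, so check the PyG site for the exact tags it publishes for your environment):

```python
def pyg_wheel_index(torch_version: str, cuda_version: str) -> str:
    """Build the PyG wheel index URL used by the torch_scatter /
    torch_cluster pip command above."""
    return f"https://data.pyg.org/whl/torch-{torch_version}+{cuda_version}.html"

# Example: PyTorch 2.3.1 built against CUDA 12.1
print(pyg_wheel_index("2.3.1", "cu121"))
```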

  5. Download Model Checkpoint: The currently available skeleton prediction model checkpoint is hosted on Hugging Face and will typically be downloaded automatically by the provided scripts/functions.

  6. (Optional, for importing/exporting .vrm) Install the Blender addon: The Blender addon is modified from VRM-Addon-for-Blender.

    Make sure you are in the root directory of the project, then:

    python -c "import bpy, os; bpy.ops.preferences.addon_install(filepath=os.path.abspath('blender/add-on-vrm-v2.20.77_modified.zip'))"
    

RigXL Dataset

processed data link

Note that aside from VRoid, all models are selected from Objaverse. If you already have the Objaverse dataset (or need to download the models from the web), you only need to download mapping.json.

The JSON contains the ids of all models, with type indicating each model's category and url specifying where to download it. url is the same as fileIdentifier in Objaverse.

The training/validation split is in the datalist folder.

📝 Note:
All floating-point values are stored in float16 format for compression.
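
Because the arrays are stored as float16, it is usually worth upcasting before doing geometry processing. A minimal self-contained sketch, using an in-memory .npz in place of a real dataset file:

```python
import io

import numpy as np

# Simulate a dataset file: arrays are stored as float16 for compression.
buf = io.BytesIO()
np.savez(buf, vertices=np.zeros((4, 3), dtype=np.float16))
buf.seek(0)

# Load and upcast to float32 before any numerically sensitive processing.
data = np.load(buf)
vertices = data["vertices"].astype(np.float32)
```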

Visualize Data

Put the dataset in dataset_clean, go back to the project root, and run the following Python snippet to export an FBX model:

from src.data.raw_data import RawData
raw_data = RawData.load("dataset_clean/rigxl/12345/raw_data.npz")
raw_data.export_fbx("res.fbx")

<details> <summary><strong>📁 Dataset Format</strong> (click to expand)</summary>

🔑 Keys of Data

All models are converted into world space.

  • vertices:
    Position of the vertices of the mesh, shape (N, 3).

  • vertex_normals:
    Normals of the vertices, processed by Trimesh, shape (N, 3).

  • faces:
    Indices of mesh faces (triangles), starting from 0, shape (F, 3).

  • face_normals:
    Normals of the faces, shape (F, 3).

  • joints:
    Positions of the armature joints, shape (J, 3).

  • skin:
    Skinning weights for each vertex, shape (N, J).

  • parents:
    Parent index of each joint, where parents[0] is always None (root), shape (J).

  • names:
    Name of each joint.

  • matrix_local:
    The local axis of each bone; aligned to Y-up axis, consistent with Blender.

</details>
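
The invariants above can be checked mechanically. The sketch below is a hypothetical helper, not official tooling; the weight-normalization check assumes the common convention that skin weights sum to 1 per vertex, which the format description does not state explicitly.

```python
import numpy as np

def check_rig(vertices, faces, joints, skin, parents):
    """Sanity-check the documented dataset shapes and invariants."""
    n, j = vertices.shape[0], joints.shape[0]
    assert vertices.shape == (n, 3) and joints.shape == (j, 3)
    assert faces.min() >= 0 and faces.max() < n        # 0-based triangle indices
    assert skin.shape == (n, j)                        # per-vertex weights over joints
    assert parents[0] is None                          # joint 0 is the root
    # Assumed convention (not stated above): weights sum to ~1 per vertex.
    assert np.allclose(skin.sum(axis=1), 1.0, atol=1e-2)
    return True
```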

Usage

Skeleton Prediction

Generate a skeleton for your 3D model using our pre-trained model. The process automatically analyzes the geometry and predicts an appropriate skeletal structure.

# Process a single file
bash launch/inference/generate_skeleton.sh --input examples/giraffe.glb --output results/giraffe_skeleton.fbx

# Process multiple files in a directory
bash launch/inference/generate_skeleton.sh --input_dir <your_input_directory> --output_dir <your_output_directory>

# Try different skeleton variations by changing the random seed
bash launch/inference/generate_skeleton.sh --input examples/giraffe.glb --output results/giraffe_skeleton.fbx --seed 42

Supported input formats: .obj, .fbx, .glb, and .vrm
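
For the directory mode, a small driver can filter inputs by the supported suffixes before invoking the launch script. A sketch, where riggable_files and run_skeleton are hypothetical helpers and the script path is the one shown above:

```python
import subprocess
from pathlib import Path

# Suffixes listed as supported input formats above.
SUPPORTED = {".obj", ".fbx", ".glb", ".vrm"}

def riggable_files(input_dir):
    """Collect files the skeleton script accepts (suffix check only)."""
    return sorted(p for p in Path(input_dir).rglob("*")
                  if p.suffix.lower() in SUPPORTED)

def run_skeleton(path, out_dir):
    """Invoke the launch script from this README for one file."""
    out = Path(out_dir) / (path.stem + "_skeleton.fbx")
    subprocess.run(["bash", "launch/inference/generate_skeleton.sh",
                    "--input", str(path), "--output", str(out)],
                   check=True)
```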

Skinning Weight Prediction

# Skin a single file
bash launch/inference/generate_skin.sh --input examples/skeleton/giraffe.fbx --output results/giraffe_skin.fbx

# Process multiple files in a directory
bash launch/inference/generate_skin.sh --input_dir <your_input_directory> --output_dir <your_output_directory>

Note that the command above uses an edited version of the skeleton produced in the skeleton-prediction phase. The results may degrade significantly if the skeleton is inaccurate, for example if tail bones or wing bones are missing. It is therefore recommended to refine the skeleton before performing skinning in order to achieve better results.

Merge the Predicted Results

Combine the predicted skeleton with your original 3D model to create a fully rigged asset:

# Merge skeleton from skeleton prediction
bash launch/inference/merge.sh --source results/giraffe_skeleton.fbx --target examples/giraffe.glb --output results/giraffe_rigged.glb

# Or merge skin from skin prediction
bash launch/inference/merge.sh --source results/giraffe_skin.fbx --target examples/giraffe.glb --output results/giraffe_rigged.glb

Note that there will be no skinning if you try to merge a skeleton file (giraffe_skeleton.fbx). Use the predicted skinning result (giraffe_skin.fbx) instead!

RigNet Dataset Validation

Validate the metrics reported in the paper. This is intended for academic use.

First, download the processed dataset from Hugging Face and extract it to the dataset_clean directory.

Then run the following command:

python run.py --task=configs/task/validate_rignet.yaml

To export skeleton & mesh, set record_res to True in the config file configs/system/ar_validate_rignet.yaml.

Train from Scratch

The code may be
