
<h1 align="center"> SAGE: Sim2Real Actuator Gap Estimator </h1> <div align="center">

Isaac Lab IsaacSim Linux platform License: Apache 2.0 Python 3.10

</div>

Overview

SAGE (Sim2Real Actuator Gap Estimator) is a comprehensive toolkit for analyzing the differences between simulated and real robot joint motions. This project provides systematic tools for measuring, visualizing, and understanding sim-to-real gaps in robotic systems, enabling researchers and engineers to quantify and improve the transfer of robot behaviors from simulation to reality.

SAGE combines:

  • Isaac Sim simulation for physics-based robot motion execution
  • Multi-metric evaluation with statistical analysis and visualization
  • Real robot integration for comprehensive sim-to-real comparison

Table of Contents

  • Installation
  • Usage
  • Data Format
  • Processed Sim2Real Datasets

Installation

Prerequisites

  • Ubuntu 22.04 LTS
  • NVIDIA GPU with compatible drivers
  • Python 3.10
  • Isaac Sim 5.0.0
  • Isaac Lab 2.2.0

Note: If you are using the provided Docker image, you do not need to install Python, Isaac Sim, or Isaac Lab; these dependencies are pre-installed in the image.

Clone the repository:

git clone https://github.com/isaac-sim2real/sage.git
cd sage

For the rest of the installation, you can install the necessary dependencies either directly on your host machine or by using the provided Docker image.

Host Setup

Follow the Isaac Lab installation guide to set up Isaac Sim and Isaac Lab, and then install the dependencies:

# Install dependencies
pip install -r requirements.txt

Set the PYTHONPATH to allow scripts to find the package:

export PYTHONPATH="$(pwd):${PYTHONPATH}"

Docker Setup

Alternatively, if you have Docker and the NVIDIA Container Toolkit installed, you can build the Docker image instead of installing the dependencies on your host machine.

docker build -t sage .

then start the container:

xhost +
docker run --name isaac-lab --entrypoint bash -it --gpus all -e "ACCEPT_EULA=Y" --rm --network=host \
   -e "PRIVACY_CONSENT=Y" \
   -e DISPLAY \
   -v /tmp/.X11-unix:/tmp/.X11-unix \
   -v $HOME/.Xauthority:/root/.Xauthority \
   -v ~/docker/isaac-sim/cache/kit:/isaac-sim/kit/cache:rw \
   -v ~/docker/isaac-sim/cache/ov:/root/.cache/ov:rw \
   -v ~/docker/isaac-sim/cache/pip:/root/.cache/pip:rw \
   -v ~/docker/isaac-sim/cache/glcache:/root/.cache/nvidia/GLCache:rw \
   -v ~/docker/isaac-sim/cache/computecache:/root/.nv/ComputeCache:rw \
   -v ~/docker/isaac-sim/logs:/root/.nvidia-omniverse/logs:rw \
   -v ~/docker/isaac-sim/data:/root/.local/share/ov/data:rw \
   -v ~/docker/isaac-sim/documents:/root/Documents:rw \
   -v $(pwd):/app:rw \
   sage

and run the rest of the commands in the container.

Usage

Simulation Execution

Execute robot motions in Isaac Sim simulation:

${ISAACSIM_PATH}/python.sh scripts/run_simulation.py \
    --robot-name h1_2 \
    --motion-source amass \
    --motion-files motion_files/h1_2/amass \
    --valid-joints-file configs/h1_2_valid_joints.txt \
    --output-folder output \
    --fix-root \
    --physics-freq 200 \
    --render-freq 200 \
    --control-freq 50 \
    --kp 100 \
    --kd 2 \
    --headless

This should take about 20 minutes to complete. For debugging purposes, you can run the script without the --headless flag to visualize the simulation.

Data Analysis

Generate comprehensive analysis reports comparing simulation and real robot data:

python scripts/run_analysis.py \
    --robot-name h1_2 \
    --motion-source amass \
    --motion-names "*" \
    --output-folder output \
    --valid-joints-file configs/h1_2_valid_joints.txt

Outputs:

  • Metrics Excel files with RMSE, MAPE, correlation, cosine similarity
  • Visualization plots for individual joint comparisons (position, velocity, torque)
  • Statistical boxplots comparing simulation vs real robot performance
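As a rough sketch of how such per-joint metrics can be computed with NumPy (the function name and return keys here are illustrative, not SAGE's actual API):

```python
import numpy as np

def joint_metrics(sim, real):
    """Compare two equal-length joint trajectories (radians) from sim and real runs."""
    sim, real = np.asarray(sim, float), np.asarray(real, float)
    err = sim - real
    rmse = float(np.sqrt(np.mean(err ** 2)))
    # MAPE is undefined where the real signal crosses zero; mask those samples.
    nonzero = np.abs(real) > 1e-8
    mape = float(np.mean(np.abs(err[nonzero] / real[nonzero])) * 100)
    corr = float(np.corrcoef(sim, real)[0, 1])
    cosine = float(np.dot(sim, real) / (np.linalg.norm(sim) * np.linalg.norm(real)))
    return {"rmse": rmse, "mape": mape, "correlation": corr, "cosine": cosine}
```

Identical trajectories yield an RMSE and MAPE of 0 and a correlation and cosine similarity of 1, so these metrics capture complementary notions of "gap": absolute error, relative error, and shape agreement.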

Real Robot Integration

Different robot manufacturers provide different control APIs and dependencies. For example, Unitree robots require ROS2 and Unitree SDK, Realman robots require their proprietary Robotic_Arm_Custom package, while LeRobot SO-101 requires the lerobot package. See the robot-specific documentation below for detailed installation instructions.

Supported robots:

  • Unitree G1 and H1-2 humanoid robots
  • Realman WR75S dual-arm robot
  • LeRobot SO-101 follower arm

Use the unified script to collect motion data on real robots (with custom robot-specific parameters):

python scripts/run_real.py \
    --robot-name {g1|h12|realman|so101} \
    --motion-files 'path/to/your/motion_sequence.txt' \
    --output-folder 'path/to/your/output_folder'

For detailed setup instructions, usage examples, and robot-specific configurations, refer to the robot-specific documentation in the repository.

OSMO Workflow

We have created an OSMO workflow for one-click submission of simulation and analysis tasks; the results are collected as OSMO datasets. Please refer to the OSMO documentation to get onboarded onto OSMO, and then set up the NGC registry credentials:

osmo credential set my-ngc-cred \
    --type REGISTRY \
    --payload registry=nvcr.io \
    username='$oauthtoken' \
    auth=<ngc_api_key>

The workflow can be submitted using the following command:

osmo workflow submit osmo_workflow.yaml

The results can be downloaded using the following command:

osmo dataset download sage ./

Note: The osmo_workflow.yaml file is configured to run simulations for all 3 robots (h1_2, g1, wr75s) with all AMASS motion files on the A40 node of the OSMO platform. If necessary, you can modify this configuration file to suit different resource requirements or to run a subset of robots/motion files.

Data Format

Motion Files

Motion files contain joint trajectories retargeted to specific robots and are located in motion_files/{robot_name}/{source}/.

Format:

  • Line 1: Joint names (comma-separated)
  • Line 2+: Joint angles in radians (comma-separated)
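A minimal loader for this format might look like the following (the function name is an assumption for illustration, not part of SAGE):

```python
import csv

def load_motion_file(path):
    """Parse a SAGE motion file: a header line of joint names,
    then one comma-separated frame of joint angles (radians) per line."""
    with open(path) as f:
        reader = csv.reader(f)
        joint_names = [name.strip() for name in next(reader)]
        frames = [[float(v) for v in row] for row in reader if row]
    return joint_names, frames
```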

Motion Sources:

  • AMASS: Motion capture data from AMASS Dataset
  • Retargeting: Convert motion capture to robot morphology. Various retargeting methods exist, e.g., see Human2Humanoid

Simulation Output

Generated in output/sim/{robot_name}/{source}/{motion_name}/:

  • control.csv: Command positions sent to robot (radians)
  • state_motor.csv: Actual joint states: positions (radians), velocities, and torques
  • joint_list.txt: Joint configuration

CSV Format:

type,timestamp,positions,velocities,torques
CONTROL/STATE_MOTOR,0.0,"[angle1, angle2, ...]","[vel1, vel2, ...]","[torque1, torque2, ...]"
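Since the list-valued columns are serialized as quoted Python-style lists, one way to read them back is with csv.DictReader plus ast.literal_eval (a hypothetical helper, assuming only the format above):

```python
import ast
import csv

def read_sim_csv(path):
    """Parse a SAGE simulation CSV into a list of dicts with numeric fields."""
    rows = []
    with open(path) as f:
        for rec in csv.DictReader(f):
            rows.append({
                "type": rec["type"],
                "timestamp": float(rec["timestamp"]),
                # list-valued columns are stored as quoted Python-style lists
                "positions": ast.literal_eval(rec["positions"]),
                "velocities": ast.literal_eval(rec["velocities"]),
                "torques": ast.literal_eval(rec["torques"]),
            })
    return rows
```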

Real Robot Output

Generated in output/real/{robot_name}/{source}/{motion_name}/:

  • control.csv: Commands sent to real robot (radians)
  • state_motor.csv: Measured joint states (radians)
  • state_base.csv: IMU/base measurements
  • event.csv: Event timestamps

Key Differences from Simulation:

  • Timestamps in microseconds (vs. seconds)
  • Additional columns in state_motor.csv: temperatures, currents
  • Type names: Control/StateMotor (vs. CONTROL/STATE_MOTOR)
  • Irregular timing due to real-world constraints
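A small normalization helper can bridge these conventions before comparison; the field names and type strings below follow the formats described above, but the helper itself is an illustrative sketch, not SAGE code:

```python
def normalize_real_row(row):
    """Map a real-robot CSV row onto the simulation conventions:
    upper-case type names and timestamps in seconds rather than microseconds."""
    type_map = {"Control": "CONTROL", "StateMotor": "STATE_MOTOR"}
    out = dict(row)
    out["type"] = type_map.get(row["type"], row["type"])
    out["timestamp"] = float(row["timestamp"]) / 1e6  # microseconds -> seconds
    return out
```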

Processed Sim2Real Datasets

After collecting both simulation and real robot data pairs, we process them into structured datasets suitable for training sim2real gap compensation models. These datasets align temporal sequences and provide paired observations for machine learning approaches.
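One common way to align irregularly-timed real measurements with the regular simulation grid is linear interpolation onto the simulation timestamps, sketched here per joint with NumPy (an illustration of the idea, not necessarily the method SAGE uses):

```python
import numpy as np

def align_to_sim(sim_t, real_t, real_q):
    """Resample an irregularly-sampled real joint signal (times real_t,
    values real_q) onto the regular simulation time grid sim_t."""
    sim_t = np.asarray(sim_t, float)
    real_t = np.asarray(real_t, float)
    real_q = np.asarray(real_q, float)
    return np.interp(sim_t, real_t, real_q)
```

The result is a paired (sim, real) sample at every simulation timestep, which is what the gap-compensation training data requires.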

The complete dataset containing both Unitree and RealMan robot data is available for download: PKU Disk Link.

Unitree Dataset

This dataset captures complex upper-body motions of the H1-2 humanoid robot under varying payload conditions (0 kg, 1 kg, and 2 kg).
