Hand-object Interaction Pretraining from Videos

This repo contains code for the paper Hand-object Interaction Pretraining from Videos.


For a brief overview, check out the project webpage!

<img src='imgs/approach.png'>

For any questions, please contact Himanshu Gaurav Singh.

Setup

  • Create the conda environment with conda env create -f env.yml
  • Install IsaacGym in this environment.
  • Download the asset folder and put it in the root directory.
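Assembled as one script, the setup steps above might look like the following sketch. The environment name, the IsaacGym install path, and the assets directory name are assumptions, since the README only links to these resources:

```shell
# Sketch of the setup steps; names and paths below are assumptions.
conda env create -f env.yml
conda activate hop                        # env name assumed from env.yml

# IsaacGym is distributed separately by NVIDIA; after unpacking it:
pip install -e /path/to/isaacgym/python   # path assumed

# Place the downloaded asset folder at the repo root, e.g.:
#   <repo>/assets/
```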

Running the code

Pretraining

  • Download the hand-object interaction dataset from here. Extract using tar -xf hoi_pretraining_data.tar.xz. Put it under the root directory.
  • Run bash scripts/pretrain.sh <DATADIR>
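Putting the two pretraining steps together (the name of the extracted directory is an assumption):

```shell
# Extract the dataset, then point the pretraining script at it.
tar -xf hoi_pretraining_data.tar.xz
bash scripts/pretrain.sh hoi_pretraining_data   # <DATADIR> = extracted folder (name assumed)
```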

Finetuning

  • Download the pretrained checkpoint from here. You can also use your own trained checkpoint.
  • For your choice of task, run bash scripts/finetune/finetune_{task}.sh.
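For example, the available tasks are whatever scripts exist under scripts/finetune/; the task name below is a placeholder, not one confirmed by the README:

```shell
# List the finetuning scripts to see the available tasks,
# then run one. "grasp" here is a hypothetical task name.
ls scripts/finetune/
bash scripts/finetune/finetune_grasp.sh
```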

Visualising trained policies

  • Run bash scripts/run_policy.sh <PATH_TO_POLICY>.
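For instance, with a checkpoint produced by finetuning (the path below is purely illustrative):

```shell
# Visualise a trained policy; the checkpoint path is hypothetical.
bash scripts/run_policy.sh runs/finetune/policy.pt
```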

Citation

Acknowledgment

This work was supported by the DARPA Machine Common Sense program, the DARPA Transfer from Imprecise and Abstract Models to Autonomous Technologies (TIAMAT) program, and by the ONR MURI award N00014-21-1-2801. This work was also funded by ONR MURI N00014-22-1-2773. We thank Adhithya Iyer for assistance with teleoperation systems, Phillip Wu for setting-up the real robot, and Raven Huang, Jathushan Rajasegaran and Yutong Bai for helpful discussions.
