<p align="center"> <img src="imgs/oakink_logo.png"" alt="Logo" width="20%"> </p> <h1 align="center"> Tink</h1> <p align="center"> <strong>CVPR, 2022</strong> <br /> <a href="https://lixiny.github.io"><strong>Lixin Yang*</strong></a> · <a href="https://kailinli.top"><strong>Kailin Li*</strong></a> · <a href=""><strong>Xinyu Zhan*</strong></a> · <a href=""><strong>Fei Wu</strong></a> · <a href="https://anran-xu.github.io"><strong>Anran Xu</strong></a> . <a href="https://liuliu66.github.io"><strong>Liu Liu</strong></a> · <a href="https://mvig.sjtu.edu.cn"><strong>Cewu Lu</strong></a> <br /> \star = equal contribution </p> <p align="center"> <a href='https://openaccess.thecvf.com/content/CVPR2022/html/Yang_OakInk_A_Large-Scale_Knowledge_Repository_for_Understanding_Hand-Object_Interaction_CVPR_2022_paper.html'> <img src='https://img.shields.io/badge/Paper-PDF-yellow?style=flat&logo=googlescholar&logoColor=blue' alt='Paper PDF'> </a> <a href='https://arxiv.org/abs/2203.15709' style='padding-left: 0.5rem;'> <img src='https://img.shields.io/badge/ArXiv-PDF-green?style=flat&logo=arXiv&logoColor=green' alt='ArXiv PDF'> </a> <a href='https://oakink.net' style='padding-left: 0.5rem;'> <img src='https://img.shields.io/badge/Project-Page-blue?style=flat&logo=Google%20chrome&logoColor=blue' alt='Project Page'> <a href='https://www.youtube.com/watch?v=vNTdeXlLdU8' style='padding-left: 0.5rem;'> <img src='https://img.shields.io/badge/Youtube-Video-red?style=flat&logo=youtube&logoColor=red' alt='Youtube Video'> </a> </p>

This repo contains the official implementation of Tink -- one of the core contributions in the CVPR2022 paper: OakInk.

Tink is a novel method that Transfers the hand's INteraction Knowledge among objects.


Installation

  • First, clone this repo:

    git clone https://github.com/KailinLi/Tink.git
    cd Tink
    git submodule init && git submodule update
    
  • Second, set up the environment by following the stand-alone installation instructions in OakInk to create the conda environment.

  • Third, inside the OakInk directory, install oikit as a package:

    $ cd OakInk
    $ pip install .
    

Download

In this repo, we provide a mini dataset to demonstrate the pipeline of Tink.

  • Download the asset files.
  • Download MANO following the official instructions, and put the mano_v1_2 directory under the assets directory.
  • Download the mini dataset from this link, and unzip it under the DeepSDF_OakInk directory.

Your directory should look like this:

Tink
├── assets
│   ├── anchor
│   ├── hand_palm_full.txt
│   └── mano_v1_2
├── DeepSDF_OakInk
│   ├── data
│   │   ├── meta
│   │   ├── OakInkObjects
│   │   ├── OakInkVirtualObjects
│   │   ├── raw_grasp
│   │   └── sdf
│   │       └── phone

DeepSDF

In this section, we demonstrate how to preprocess the object meshes and train a category-level DeepSDF.

If you are not interested in training DeepSDF, feel free to skip this section.

1. Compile the C++ code

Please follow the official instructions of DeepSDF.

You will get two executables in the DeepSDF_OakInk/bin directory. (We modified some of the original DeepSDF source code, so please make sure to compile these binaries from scratch.)

2. Preprocess the object meshes

export MESA_GL_VERSION_OVERRIDE=3.3
export PANGOLIN_WINDOW_URI=headless://

cd DeepSDF_OakInk
python preprocess_data.py --data_dir data/sdf/phone --threads 4

After finishing the script, you can find the SDF files in DeepSDF_OakInk/data/sdf/phone/SdfSamples directory.
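The files in SdfSamples follow DeepSDF's sampling layout: as an assumption based on the upstream DeepSDF code, each .npz holds a "pos" array and a "neg" array of shape (N, 4), where each row is a query point (x, y, z) plus its signed distance. A minimal sketch of writing and reading that layout (the filename is illustrative):

```python
import numpy as np

# Synthesize a tiny file in the assumed DeepSDF SdfSamples layout:
# "pos" = points outside the surface (sdf > 0), "neg" = points inside.
pos = np.array([[0.1, 0.0, 0.0, 0.02],
                [0.0, 0.2, 0.0, 0.05]], dtype=np.float32)
neg = np.array([[0.0, 0.0, 0.0, -0.03]], dtype=np.float32)
np.savez("C52001_demo.npz", pos=pos, neg=neg)

# Inspect a sample the way a data loader would.
sample = np.load("C52001_demo.npz")
points_outside = sample["pos"][:, :3]   # xyz of positive-SDF points
sdf_outside = sample["pos"][:, 3]       # their signed distances
print(points_outside.shape, sdf_outside.shape)  # (2, 3) (2,)
```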

3. Train the network

CUDA_VISIBLE_DEVICES=0 python train_deep_sdf.py -e data/sdf/phone
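Conceptually, the network learns a decoder that maps a per-shape latent code plus a query point to a signed distance. The toy sketch below shows only that input/output contract; the layer sizes and two-layer structure are placeholders, while the real architecture (an 8-layer MLP with a skip connection, configured via specs.json) is what the training script builds:

```python
import numpy as np

# Toy sketch of the DeepSDF decoder idea: concat(latent, xyz) -> scalar SDF.
rng = np.random.default_rng(0)
latent_dim, hidden = 256, 512

W1 = rng.standard_normal((latent_dim + 3, hidden)) * 0.01
W2 = rng.standard_normal((hidden, 1)) * 0.01

def decode_sdf(latent, xyz):
    h = np.maximum(np.concatenate([latent, xyz]) @ W1, 0.0)  # ReLU layer
    return np.tanh(h @ W2)[0]  # output squashed to (-1, 1) via tanh

latent_code = rng.standard_normal(latent_dim)  # one code per training shape
sdf = decode_sdf(latent_code, np.array([0.1, 0.0, 0.0]))
print(float(sdf))  # a scalar signed distance for that query point
```

During training, the latent codes themselves are optimized jointly with the decoder weights, which is why step 4 below can dump one code per object.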

4. Dump the latent codes and reconstructed meshes

CUDA_VISIBLE_DEVICES=0 python reconstruct_train.py -e data/sdf/phone  --mesh_include

You can find the reconstructed meshes under the DeepSDF_OakInk/data/sdf/phone/Reconstructions/Meshes.

Shape Interpolation

If you skipped the above section, you can use our pre-trained DeepSDF network instead. Please download the files, unzip them, and replace the original phone directory:

sdf
├── phone
│   ├── network
│   │   ├── ModelParameters
│   │   │   └── latest.pth
│   │   └── LatentCodes
│   ├── Reconstructions
│   │   ├── Codes
│   │   │   ├── C52001.pth
│   │   │   ├── ...
│   │   └── Meshes
│   │       ├── C52001.ply
│   │       ├── ...
│   ├── rescale.pkl
│   ├── SdfSamples
│   │   ├── C52001.npz
│   │   ├── ...
│   ├── SdfSamples_resize
│   ├── specs.json
│   └── split.json

Now, go to the Tink directory, and run the following script to generate the interpolations:

cd ..

# you can generate all of the interpolations:
python tink/gen_interpolate.py --all -d ./DeepSDF_OakInk/data/sdf/phone

# or just interpolate between two objects (from C52001 to o52105):
python tink/gen_interpolate.py -d ./DeepSDF_OakInk/data/sdf/phone -s C52001 -t o52105

You can find the interpolations in DeepSDF_OakInk/data/sdf/phone/interpolate directory.
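The core of the interpolation step is a linear blend in DeepSDF latent space: intermediate codes between the source and target are decoded back into meshes. A minimal sketch of the blending (the random vectors stand in for the real codes under Reconstructions/Codes, and decoding is omitted):

```python
import numpy as np

# Stand-ins for the optimized latent codes of C52001 and o52105.
rng = np.random.default_rng(0)
z_source = rng.standard_normal(256)
z_target = rng.standard_normal(256)

# Linearly interpolate the codes; each blended code would then be
# decoded by the trained DeepSDF network into an intermediate mesh.
steps = 5
blends = [(1 - t) * z_source + t * z_target
          for t in np.linspace(0.0, 1.0, steps)]

print(len(blends))  # 5 codes, endpoints included
```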

Calculate Contact Info

We calculate the contact region of C52001:

python tink/cal_contact_info.py \
	-d ./DeepSDF_OakInk/data/sdf/phone \
	-s C52001 \
	--tag demo \
	-p DeepSDF_OakInk/data/raw_grasp/demo/C52001_0001_0000/2021-10-09-15-13-39/dom.pkl \
	--vis

The resulting contact_info.pkl is stored in DeepSDF_OakInk/data/sdf/phone/contact/C52001/demo_e54965ec08, where e54965ec08 is the hash code of the hand parameters.

<img src="imgs/contact.png" alt="contact" style="zoom:50%;" />

Contact Mapping

We take the virtual object o52105 as an example.

To transfer the contact information from C52001 to o52105:

python tink/info_transform.py \
	-d ./DeepSDF_OakInk/data/sdf/phone \
	-s C52001 \
	-t o52105 \
	-p DeepSDF_OakInk/data/sdf/phone/contact/C52001/demo_e54965ec08

You can find the transferred contact info in the DeepSDF_OakInk/data/sdf/phone/contact/C52001/demo_e54965ec08/o52105 directory.
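At its core, contact mapping carries per-vertex contact values from the source shape to the target by following vertex correspondences. A heavily simplified nearest-neighbor version of that idea (an assumption for illustration — Tink transfers through the interpolated intermediate shapes rather than jumping directly from source to target):

```python
import numpy as np

def transfer_contact(src_verts, src_contact, tgt_verts):
    """Copy each target vertex's contact value from its nearest source vertex."""
    # Pairwise squared distances between target and source vertices: (T, S)
    d2 = ((tgt_verts[:, None, :] - src_verts[None, :, :]) ** 2).sum(-1)
    nearest = d2.argmin(axis=1)
    return src_contact[nearest]

src_verts = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
src_contact = np.array([1.0, 0.0])          # only vertex 0 is in contact
tgt_verts = np.array([[0.1, 0.0, 0.0], [0.9, 0.0, 0.0], [0.45, 0.0, 0.0]])
print(transfer_contact(src_verts, src_contact, tgt_verts))  # [1. 0. 1.]
```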

Pose Refinement

CUDA_VISIBLE_DEVICES=0 python tink/pose_refine.py \
	-d ./DeepSDF_OakInk/data/sdf/phone \
	-s C52001 \
	-t o52105 \
	-p DeepSDF_OakInk/data/sdf/phone/contact/C52001/demo_e54965ec08 \
	--vis

The fitted hand pose will be stored in the DeepSDF_OakInk/data/sdf/phone/contact/C52001/demo_e54965ec08/o52105 directory. (When visualizing the hand pose, you might need to click the 'x' on the window status bar to start the fitting.)

<img src="imgs/refine.gif" alt="refine" style="zoom:50%;" />

We also provide all the transferred hand poses of the mini dataset. You can download the files, unzip them and replace the original phone directory.

