ContactOpt: Optimizing Contact to Improve Grasps
Patrick Grady, Chengcheng Tang, Christopher D. Twigg, Minh Vo, Samarth Brahmbhatt, Charles C. Kemp

Physical contact between hands and objects plays a critical role in human grasps. We show that optimizing the pose of a hand to achieve expected contact with an object can improve hand poses inferred via image-based methods. Given a hand mesh and an object mesh, a deep model trained on ground truth contact data infers desirable contact across the surfaces of the meshes. Then, ContactOpt efficiently optimizes the pose of the hand to achieve desirable contact using a differentiable contact model. Notably, our contact model encourages mesh interpenetration to approximate deformable soft tissue in the hand. In our evaluations, our methods resulted in grasps that better matched ground truth contact, had lower kinematic error, and were significantly preferred by human participants.
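The pipeline above (predict desirable contact, then optimize the hand pose against it with a differentiable contact model) can be illustrated with a toy sketch. This is NOT the paper's implementation: the point sets, contact weights, and the rigid translation being optimized are all stand-in placeholders, and the real system optimizes full MANO pose parameters with a richer contact model.

```python
import torch

# Toy sketch of contact-driven pose refinement (hypothetical, not the
# paper's code): optimize a rigid translation of a "hand" point set so
# that vertices predicted to be in contact move toward the object.
torch.manual_seed(0)

hand_pts = torch.randn(50, 3)            # placeholder hand vertices
object_pts = torch.randn(200, 3) + 1.0   # placeholder object surface samples
contact_weight = torch.rand(50)          # per-vertex desired contact in [0, 1]

translation = torch.zeros(3, requires_grad=True)
opt = torch.optim.Adam([translation], lr=0.05)

def contact_loss(t):
    moved = hand_pts + t
    # distance from each hand vertex to its nearest object point
    d = torch.cdist(moved, object_pts).min(dim=1).values
    # pull high-contact vertices onto the object surface
    return (contact_weight * d).mean()

initial = contact_loss(translation).item()
for _ in range(200):
    opt.zero_grad()
    loss = contact_loss(translation)
    loss.backward()
    opt.step()
final = contact_loss(translation).item()
print(f"contact loss: {initial:.3f} -> {final:.3f}")
```

Because the loss is differentiable in the pose parameters, standard gradient-based optimizers can refine the grasp directly.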
[Paper] [Paper website] [Supplementary] [Video]
Installation
Refer to installation instructions.
Run ContactOpt on the demo
A small dataset of 10 grasps from an image-based pose estimator has been included (paper, section 4.2.2). To run ContactOpt on this demo dataset:
python contactopt/run_contactopt.py --split=demo
To calculate aggregate statistics:
python contactopt/run_eval.py --split=demo
To visualize individual results, run the evaluation script with the --vis flag. Pan, rotate, and zoom with the mouse; press Q to advance to the next frame.
python contactopt/run_eval.py --split=demo --vis

Run ContactOpt on user-provided data
A demo script has been provided to demonstrate running ContactOpt on a single hand/object pair, and may easily be modified to run on your own data. Demo files containing the object mesh and MANO parameters are included.
python contactopt/run_user_demo.py
python contactopt/run_eval.py --split=user --vis
Note that this project uses the MANO pose parameterized with 15 PCA components. Other projects may use the MANO model in different formats (such as 45 individual joint angles). The contactopt.util.fit_pca_to_axang function has been provided to convert between these representations.
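The relationship between the two formats is a linear map: the 15 PCA coefficients are expanded through a learned basis into 45 per-joint axis-angle values, and the inverse is a least-squares projection. The sketch below illustrates this with random placeholder matrices; the real basis and mean come from the MANO model file, and contactopt.util.fit_pca_to_axang performs the conversion in the actual codebase (its exact signature may differ).

```python
import numpy as np

# Hypothetical illustration of MANO's PCA pose vs. axis-angle pose.
# pca_basis and pose_mean are random placeholders standing in for the
# MANO model's hands_components[:15] and hands_mean.
rng = np.random.default_rng(0)
pca_basis = rng.normal(size=(15, 45))   # placeholder PCA basis
pose_mean = rng.normal(size=45)         # placeholder mean pose

pca_coeffs = rng.normal(size=15)             # 15-dim PCA pose
axang = pose_mean + pca_coeffs @ pca_basis   # 45 axis-angle values

# Least-squares projection back to PCA coefficients.
recovered, *_ = np.linalg.lstsq(pca_basis.T, axang - pose_mean, rcond=None)
print(np.allclose(recovered, pca_coeffs))
```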
Running on Datasets
To run ContactOpt on the datasets described in the paper, download and generate the dataset as described in the installation document. Use the --split=aug flag for Perturbed ContactPose, or --split=im for the image-based pose estimates.
python contactopt/run_contactopt.py --split=aug
python contactopt/run_eval.py --split=aug
Training DeepContact
The DeepContact network is trained on the Perturbed ContactPose dataset.
python contactopt/train_deepcontact.py
Citation
@inproceedings{grady2021contactopt,
author={Grady, Patrick and Tang, Chengcheng and Twigg, Christopher D. and Vo, Minh and Brahmbhatt, Samarth and Kemp, Charles C.},
title = {{ContactOpt}: Optimizing Contact to Improve Grasps},
booktitle = {Conference on Computer Vision and Pattern Recognition (CVPR)},
year = {2021}
}
License
The code for this project is released under the MIT License.