DECO: Dense Estimation of 3D Human-Scene Contact in the Wild [ICCV 2023 (Oral)]
Code repository for the paper:
DECO: Dense Estimation of 3D Human-Scene Contact in the Wild
Shashank Tripathi, Agniv Chatterjee, Jean-Claude Passy, Hongwei Yi, Dimitrios Tzionas, Michael J. Black<br /> IEEE International Conference on Computer Vision (ICCV), 2023

[Project Page] [Paper] [Video] [Poster] [License] [Contact]
News :triangular_flag_on_post:
- [2024/05/28] :eight_pointed_black_star: DAMON object-wise contacts are released in SMPL and SMPL-X format. Please refer here for details.
- [2024/01/31] The DAMON contact labels in SMPL-X format have been released. This is the conversion script.
- [2023/10/12] The huggingface demo has been released.
- [2023/10/10] The colab demo has been released. Huggingface demo coming soon...
Installation and Setup
- First, clone the repo. Then, we recommend creating a clean conda environment, activating it, and installing torch and torchvision, as follows:

```bash
git clone https://github.com/sha2nkt/deco.git
cd deco
conda create -n deco python=3.9 -y
conda activate deco
pip install torch==1.13.0+cu117 torchvision==0.14.0+cu117 --extra-index-url https://download.pytorch.org/whl/cu117
```
Please adjust the CUDA version as required.
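When adjusting the CUDA version, only the `cuXXX` suffix of the wheel index URL changes. The tiny helper below (purely illustrative, not part of this repo) shows how the `--extra-index-url` used above is derived from a CUDA version string:

```python
def torch_index_url(cuda_version: str) -> str:
    """Build the PyTorch extra-index URL for a given CUDA version,
    following the pattern used in the pip command above,
    e.g. "11.7" -> "https://download.pytorch.org/whl/cu117"."""
    suffix = "cu" + cuda_version.replace(".", "")
    return f"https://download.pytorch.org/whl/{suffix}"

print(torch_index_url("11.7"))  # https://download.pytorch.org/whl/cu117
```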
- Install PyTorch3D. Users may refer to PyTorch3D-install for more details. However, our tests show that installing using conda sometimes runs into dependency conflicts. Hence, users may alternatively install PyTorch3D from source following the steps below.

```bash
git clone https://github.com/facebookresearch/pytorch3d.git
cd pytorch3d
pip install .
cd ..
```
- Install the other dependencies and download the required data.

```bash
pip install -r requirements.txt
sh fetch_data.sh
```
- Please download the SMPL (version 1.1.0) and SMPL-X (v1.1) files into the `data` folder. Please rename the SMPL files to `SMPL_FEMALE.pkl`, `SMPL_MALE.pkl` and `SMPL_NEUTRAL.pkl`. The directory structure for the `data` folder is elaborated below:
```
├── preprocess
├── smpl
│   ├── SMPL_FEMALE.pkl
│   ├── SMPL_MALE.pkl
│   ├── SMPL_NEUTRAL.pkl
│   ├── smpl_neutral_geodesic_dist.npy
│   ├── smpl_neutral_tpose.ply
│   ├── smplpix_vertex_colors.npy
├── smplx
│   ├── SMPLX_FEMALE.npz
│   ├── SMPLX_FEMALE.pkl
│   ├── SMPLX_MALE.npz
│   ├── SMPLX_MALE.pkl
│   ├── SMPLX_NEUTRAL.npz
│   ├── SMPLX_NEUTRAL.pkl
│   ├── smplx_neutral_tpose.ply
├── weights
│   ├── pose_hrnet_w32_256x192.pth
├── J_regressor_extra.npy
├── base_dataset.py
├── mixed_dataset.py
├── smpl_partSegmentation_mapping.pkl
├── smpl_vert_segmentation.json
└── smplx_vert_segmentation.json
```
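To sanity-check this layout before running anything, a short script like the following (an illustrative sketch; the file list is a subset of the tree above) reports which expected files are missing:

```python
import os

# A subset of the expected files under data/, taken from the tree above.
EXPECTED = [
    "smpl/SMPL_FEMALE.pkl",
    "smpl/SMPL_MALE.pkl",
    "smpl/SMPL_NEUTRAL.pkl",
    "smpl/smpl_neutral_geodesic_dist.npy",
    "smplx/SMPLX_NEUTRAL.npz",
    "weights/pose_hrnet_w32_256x192.pth",
    "J_regressor_extra.npy",
]

def missing_files(data_dir: str) -> list:
    """Return the expected files that are absent from data_dir."""
    return [f for f in EXPECTED if not os.path.exists(os.path.join(data_dir, f))]

# Example: report anything missing under ./data
# print(missing_files("data"))
```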
<a name="damon-data-description"></a>
Download the DAMON dataset
⚠️ Register an account on the DECO website, and then use your username and password to log in to the Downloads page.
Follow the instructions on the Downloads page to download the DAMON dataset. The provided metadata in the npz files is described as follows:
- `imgname`: relative path to the image file
- `pose`: SMPL pose parameters inferred from CLIFF
- `transl`: SMPL root translation inferred from CLIFF
- `shape`: SMPL shape parameters inferred from CLIFF
- `cam_k`: camera intrinsic matrix inferred from CLIFF
- `polygon_2d_contact`: 2D contact annotation from HOT
- `contact_label`: 3D contact annotations on the SMPL mesh
- `contact_label_smplx`: 3D contact annotations on the SMPL-X mesh
- `contact_label_objectwise`: 3D contact annotations split into separate object labels on the SMPL mesh
- `contact_label_smplx_objectwise`: 3D contact annotations split into separate object labels on the SMPL-X mesh
- `scene_seg`: path to the scene segmentation map from Mask2Former
- `part_seg`: path to the body part segmentation map
The order of values is the same for all the keys.
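The npz fields can be inspected directly with NumPy. The sketch below builds a tiny stand-in archive (the real files live under `datasets/Release_Datasets`) and prints each key with its array shape; the key names follow the description above, but all values here are dummies:

```python
import numpy as np

def summarize_npz(path: str) -> dict:
    """Map each key in an .npz archive to the shape of its array."""
    with np.load(path, allow_pickle=True) as data:
        return {k: np.asarray(data[k]).shape for k in data.files}

# Dummy stand-in with a few of the documented keys (values are fake).
np.savez("toy_damon.npz",
         imgname=np.array(["images/0001.jpg", "images/0002.jpg"]),
         pose=np.zeros((2, 72)),            # SMPL pose parameters per image
         shape=np.zeros((2, 10)),           # SMPL shape parameters per image
         cam_k=np.zeros((2, 3, 3)),         # camera intrinsics per image
         contact_label=np.zeros((2, 6890)))  # per-vertex SMPL contact labels

for key, shp in summarize_npz("toy_damon.npz").items():
    print(key, shp)
```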
<a name="convert-damon"></a>
Converting DAMON contact labels to SMPL-X format (and back)
To convert contact labels from SMPL to SMPL-X format and vice versa, run the following command:

```bash
python reformat_contacts.py \
    --contact_npz datasets/Release_Datasets/damon/hot_dca_trainval.npz \
    --input_type 'smpl'
```
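Conceptually, converting per-vertex contacts between body models amounts to transferring labels through a vertex correspondence (SMPL has 6,890 vertices, SMPL-X has 10,475). The sketch below is not the repo's `reformat_contacts.py`; the correspondence array is a hypothetical toy mapping used only to illustrate the idea:

```python
import numpy as np

def transfer_contacts(contact, correspondence):
    """Transfer per-vertex contact labels to a target mesh:
    target vertex i takes the label of source vertex correspondence[i]."""
    return contact[correspondence]

# Toy example: 5-vertex source mesh -> 8-vertex target mesh.
contact_src = np.array([1, 0, 0, 1, 0])    # labels on source vertices
corr = np.array([0, 0, 1, 2, 3, 3, 4, 4])  # hypothetical correspondence
print(transfer_contacts(contact_src, corr))  # [1 1 0 0 1 1 0 0]
```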
Run demo on images
The following command will run DECO on all images in the specified `--img_src` and save the renderings and colored meshes in `--out_dir`. The `--model_path` flag specifies the checkpoint to use. Additionally, the base mesh colour and the colour of the predicted contact annotation can be specified using the `--mesh_colour` and `--annot_colour` flags respectively.

```bash
python inference.py \
    --img_src example_images \
    --out_dir demo_out
```
Training and Evaluation
We release 3 versions of the DECO model:
<ol> <li> DECO-HRNet (<em>best performing model</em>) </li> <li> DECO-HRNet w/o context branches </li> <li> DECO-Swin </li> </ol>

All the checkpoints have been downloaded to `checkpoints`.
However, please note that versions 2 and 3 have been trained solely on the RICH dataset. <br>
We recommend using the first DECO version.
Please download the actual DAMON dataset from the website and place it in datasets/Release_Datasets following the instructions given.
Evaluation
To run evaluation on the DAMON dataset, please run the following command:
```bash
python tester.py --cfg configs/cfg_test.yml
```
Training
The config provided (`cfg_train.yml`) is set to train and evaluate on all three datasets: DAMON, RICH and PROX. To change this, please change the values of the keys `TRAINING.DATASETS` and `VALIDATION.DATASETS` in the config (please also change `TRAINING.DATASET_MIX_PDF` as required). <br>
Also, the best checkpoint is stored by default at `checkpoints/Other_Checkpoints`.
Please run the following command to start training of the DECO model:
```bash
python train.py --cfg configs/cfg_train.yml
```
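For reference, the dataset-related entries in `configs/cfg_train.yml` might look like the following. This is a hypothetical fragment: the key names are the ones mentioned above, but the exact values and dataset identifiers in the shipped config may differ.

```yaml
TRAINING:
  DATASETS: ['damon', 'rich', 'prox']
  DATASET_MIX_PDF: [0.6, 0.2, 0.2]   # sampling weights over the datasets
VALIDATION:
  DATASETS: ['damon', 'rich', 'prox']
```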
Training on custom datasets
To train on other datasets, please follow these steps:
- Please create an npz of the dataset, following the structure of the datasets in `datasets/Release_Datasets` with the corresponding keys and values.
- Please create scene segmentation maps, if not available. We have used Mask2Former in our work.
- For creating the part segmentation maps, this sample script can be referred to.
- Add the dataset name(s) to `train.py` (these lines), `tester.py` (these lines) and `data/mixed_dataset.py` (these lines), according to the body model being used (SMPL/SMPL-X).
- Add the path(s) to the dataset npz(s) to `common/constants.py` (these lines).
- Finally, change `TRAINING.DATASETS` and `VALIDATION.DATASETS` in the config file and you're good to go!
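As a concrete sketch of the first step, the snippet below writes a minimal custom-dataset npz with some of the documented keys. The shapes follow SMPL conventions (72 pose parameters, 10 shape parameters, 6,890 mesh vertices); all values are placeholders, and the released DAMON files carry additional keys beyond these:

```python
import numpy as np

N = 4  # number of images in the toy dataset

np.savez(
    "my_dataset.npz",
    imgname=np.array([f"images/{i:04d}.jpg" for i in range(N)]),
    pose=np.zeros((N, 72), dtype=np.float32),    # SMPL pose parameters
    shape=np.zeros((N, 10), dtype=np.float32),   # SMPL shape parameters
    transl=np.zeros((N, 3), dtype=np.float32),   # SMPL root translation
    cam_k=np.tile(np.eye(3, dtype=np.float32), (N, 1, 1)),  # intrinsics
    contact_label=np.zeros((N, 6890), dtype=np.float32),    # per-vertex contact
)

with np.load("my_dataset.npz") as d:
    print(sorted(d.files))
```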
Citing
If you find this code useful for your research, please consider citing the following paper:
```bibtex
@InProceedings{tripathi2023deco,
    author = {Tripathi, Shashank and Chatterjee, Agniv and Passy, Jean-Claude and Yi, Hongwei and Tzionas, Dimitrios and Black, Michael J.},
    title = {{DECO}: Dense Estimation of {3D} Human-Scene Contact In The Wild},
    booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV)},
    month = {October},
    year = {2023},
    pages = {8001-8013}
}
```
License
See LICENSE.
Acknowledgments
We sincerely thank Alpar Cseke for his contributions to DAMON data collection and PHOSA evaluations, Sai K. Dwivedi for facilitating PROX downstream experiments, Xianghui Xie for his generous he
