ADLD

IEEE Transactions on Affective Computing "Unconstrained Facial Action Unit Detection via Latent Feature Domain"


This repository provides the PyTorch implementation of training and testing for ADLD, the method proposed in the paper "Unconstrained Facial Action Unit Detection via Latent Feature Domain".

Getting Started

Installation

  • This code was tested with PyTorch 0.4.0 and Python 2.7
  • Clone this repo:
git clone https://github.com/ZhiwenShao/ADLD
cd ADLD

Datasets

Put the BP4D and EmotioNet datasets into the folder "dataset", following the paths shown in the list files of the folder "data/list".

Preprocessing

  • Conduct similarity transformation for face images:
    • We provide the landmarks annotated using OpenPose for EmotioNet here. Each line in the landmark annotation file contains the 49 facial landmark locations (x1,y1,x2,y2...) of one image. Put these annotation files into the folder "dataset"
    • An example of processed image can be found in the folder "data/imgs/EmotioNet/optimization_set/N_0000000001/"
    cd dataset
    python face_transform.py
    
  • Compute the weight of the loss of each AU in the BP4D training set:
    • The AU annotation files should be in the folder "data/list"
    cd dataset
    python write_AU_weight.py
    

Training

  • Train a model without using target-domain pseudo AU labels:
python train.py --mode='weak'
  • Train a model using target-domain pseudo AU labels:
python train.py --mode='full'

Testing

  • Test a model trained without using target-domain pseudo AU labels:
python test.py --mode='weak'
  • Test a model trained using target-domain pseudo AU labels:
python test.py --mode='full'

Citation

If you use this code for your research, please cite our paper:

@article{shao2021unconstrained,
  title={Unconstrained Facial Action Unit Detection via Latent Feature Domain},
  author={Shao, Zhiwen and Cai, Jianfei and Cham, Tat-Jen and Lu, Xuequan and Ma, Lizhuang},
  journal={IEEE Transactions on Affective Computing},
  year={2021},
  publisher={IEEE}
}

Should you have any questions, please contact us by email: zhiwen_shao@cumt.edu.cn
