DeepDance: Music-to-Dance Motion Choreography with Adversarial Learning
This repo contains the code for the paper on music-to-dance generation: "DeepDance: Music-to-Dance Motion Choreography with Adversarial Learning". Project Page
Requirements
- A CUDA compatible GPU
- Ubuntu >= 14.04
Usage
- Setup
Download this repo and create a new environment using the following commands:

```shell
git clone https://github.com/computer-animation-perception-group/DeepDance.git
conda create -n music_dance python==3.5
pip install -r requirement.txt
```

Put your audio files (.wav) under "./dataset/music_feature/librosa/samples".
Extract low-level musical features with:

```shell
python music_feature_extract.py
```

Run the following command to generate dance sequences:

```shell
sh generate_dance.sh
```

Generated dances are saved in "training_results/motions". You can change the output folder for generated dances by editing the last line of "generate_dance.sh".
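Judging by the "librosa" samples path, music_feature_extract.py computes low-level spectral features from the audio. As a rough illustration of what such a feature looks like, here is a minimal numpy-only sketch of a magnitude spectrogram; the repo's actual script may compute different features (e.g. MFCCs) with different parameters:

```python
import numpy as np

def magnitude_spectrogram(y, n_fft=1024, hop=512):
    """Frame a mono signal and compute an STFT magnitude spectrogram.

    This only illustrates the kind of low-level musical feature the
    extractor might emit; the real script's features and framing
    parameters may differ.
    """
    window = np.hanning(n_fft)
    n_frames = 1 + (len(y) - n_fft) // hop
    frames = np.stack([y[i * hop : i * hop + n_fft] * window
                       for i in range(n_frames)])
    # rfft of each windowed frame -> (n_frames, n_fft // 2 + 1) magnitudes
    return np.abs(np.fft.rfft(frames, axis=1))

# toy 1-second "audio" at 22050 Hz: a 440 Hz sine wave
sr = 22050
t = np.arange(sr) / sr
spec = magnitude_spectrogram(np.sin(2 * np.pi * 440.0 * t))
print(spec.shape)  # (42, 513)
```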
- Dataset
~~Data will be released soon.~~ Our EA-MUD dataset is now available.
- Training
~~Training code will be released soon.~~ Training code is now available.
- Trained Models
Trained models for multiple dance genres are available on Google Drive.
Download these models and put them in "./training_results/models".
You can generate dance sequences of different genres by changing model_path in "generate_dance.sh".
- Visualization
Open MATLAB, set the path to the "m2m_evaluation" folder, and run csv_visualization.m.
<img src="images/chacha.gif" width="300" height="300"> <img src="images/gudianwu.gif" width="300" height="300">
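If MATLAB is not available, a generated motion CSV can still be inspected from Python. This is only a sketch: it assumes the file is a plain numeric CSV with one frame per row (the actual column layout is defined by the repo's export format):

```python
import csv
import io

def load_motion_csv(fileobj):
    """Read a numeric motion CSV into a list of per-frame float rows."""
    return [[float(v) for v in row] for row in csv.reader(fileobj) if row]

# tiny stand-in for a generated motion file: rows are frames,
# columns are joint coordinates (actual layout is repo-specific)
sample = "0.0,1.0,2.0\n0.1,1.1,2.1\n"
frames = load_motion_csv(io.StringIO(sample))
print(len(frames), len(frames[0]))  # 2 3
```

In practice you would pass an open file from "training_results/motions" instead of the `io.StringIO` stand-in.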
License
Licensed under the GPL v3.0 License.
Bibtex
```
@article{sun2020deepdance,
  author={G. {Sun} and Y. {Wong} and Z. {Cheng} and M. S. {Kankanhalli} and W. {Geng} and X. {Li}},
  journal={IEEE Transactions on Multimedia},
  title={DeepDance: Music-to-Dance Motion Choreography with Adversarial Learning},
  year={2021},
  volume={23},
  number={},
  pages={497-509},
}
```
