
TransformerCVAE

Transformer-based Conditional Variational Autoencoder for Controllable Story Generation

Install / Use

/learn @fangleai/TransformerCVAE
About this skill

Quality Score: 0/100

Supported Platforms: Universal

README

TransformerCVAE

This repository contains the source code for the paper Transformer-based Conditional Variational Autoencoder for Controllable Story Generation:

@article{fang2021transformer,
  title={Transformer-based Conditional Variational Autoencoder for Controllable Story Generation},
  author={Fang, Le and Zeng, Tao and Liu, Chaochun and Bo, Liefeng and Dong, Wen and Chen, Changyou},
  journal={arXiv preprint arXiv:2101.00828},
  year={2021}
}
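The paper builds on the conditional VAE objective: maximize a reconstruction term minus a KL term that pulls the approximate posterior toward the prior, with sampling made differentiable via the reparameterization trick. A minimal, library-free sketch of those two ingredients (generic illustration only, not the repository's code; function names are my own):

```python
import math
import random

def reparameterize(mu, logvar, rng=None):
    """Sample z = mu + sigma * eps with eps ~ N(0, 1) (reparameterization trick)."""
    rng = rng or random.Random(0)
    return [m + math.exp(0.5 * lv) * rng.gauss(0.0, 1.0) for m, lv in zip(mu, logvar)]

def kl_diag_gaussian(mu, logvar):
    """KL( N(mu, diag(exp(logvar))) || N(0, I) ), summed over latent dimensions."""
    return -0.5 * sum(1.0 + lv - m * m - math.exp(lv) for m, lv in zip(mu, logvar))

# A posterior matching the prior contributes zero KL; shifting the mean costs 0.5 * mu^2.
print(kl_diag_gaussian([1.0], [0.0]))  # 0.5
```

During training the KL term is typically annealed from zero to mitigate posterior collapse, which is a known issue for text VAEs.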
  1. Get the source data (Arxiv, Yelp, WritingPrompts, WikiPlots).
  2. Pre-process the data (data/).
  3. Train (choose among implementations that differ in parallelism and precision: train.py, train_dist.py, train_dist_half.py).
  4. Generate, evaluate, and analyze (generate.py / generate_prefix.py, eval_ppl.py / eval_ppl_prefix.py, tsne_plot.py).
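Step 4 scores generated or held-out text with perplexity (eval_ppl.py / eval_ppl_prefix.py). As a sketch of the metric itself, assuming per-token log-probabilities from some language model (this is the standard definition, not the repository's evaluation code):

```python
import math

def perplexity(token_logprobs):
    """Perplexity = exp of the negative mean per-token log-probability."""
    return math.exp(-sum(token_logprobs) / len(token_logprobs))

# A model assigning every token probability 1/4 has perplexity 4.
lp = [math.log(0.25)] * 10
print(round(perplexity(lp), 6))  # 4.0
```

Lower perplexity means the model assigns higher likelihood to the evaluated text.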

Contact: lefang@buffalo.edu

Update from 2022: if you run into package version issues, apologies that there is no requirements.txt with exact versions. I used NVIDIA Apex (https://github.com/nvidia/apex) and an old PyTorch version compatible with it at the time, likely pytorch=0.4 (not 100% sure).

GitHub Stars: 166
Category: Development
Updated: 1mo ago
Forks: 19
Languages: Python

Security Score: 80/100 (audited on Mar 4, 2026; no findings)