
MolGen

[ICLR 2024] Domain-Agnostic Molecular Generation with Chemical Feedback


<h1 align="center"> ⚗️ MolGen </h1> <h3 align="center"> Domain-Agnostic Molecular Generation with Chemical Feedback </h3> <p align="center"> 📃 <a href="https://arxiv.org/abs/2301.11259" target="_blank">Paper</a> • 🤗 <a href="https://huggingface.co/zjunlp/MolGen-large" target="_blank">Model</a> • 🔬 <a href="https://huggingface.co/spaces/zjunlp/MolGen" target="_blank">Space</a> <br> </p>


<div align=center><img src="molgen.png" width="100%" height="100%" /></div>

🔔 News

📕 Requirements

To run the code, you can configure the dependencies by restoring our environment:

conda env create -f environment.yaml

and then:

conda activate my_env

📚 Resource Download

You can download the pre-trained and fine-tuned models from Hugging Face: MolGen-large and MolGen-large-opt.

You can also download the model using the following link: https://drive.google.com/drive/folders/1Eelk_RX1I26qLa9c4SZq6Tv-AAbDXgrW?usp=sharing

Moreover, the dataset used for downstream tasks can be found here.

The expected structure of files is:

moldata
├── checkpoint 
│   ├── molgen.pkl              # pre-trained model
│   ├── syn_qed_model.pkl       # fine-tuned model for QED optimization on synthetic data
│   ├── syn_plogp_model.pkl     # fine-tuned model for p-logP optimization on synthetic data
│   ├── np_qed_model.pkl        # fine-tuned model for QED optimization on natural product data
│   ├── np_plogp_model.pkl      # fine-tuned model for p-logP optimization on natural product data
├── finetune
│   ├── np_test.csv             # natural product test data
│   ├── np_train.csv            # natural product train data
│   ├── plogp_test.csv          # synthetic test data for p-logP optimization
│   ├── qed_test.csv            # synthetic test data for QED optimization
│   └── zinc250k.csv            # synthetic train data
├── generate                    # generate molecules
├── output                      # molecule candidates
└── vocab_list
    └── zinc.npy                # SELFIES alphabet

🚀 How to run

  • Fine-tune

    • First, preprocess the fine-tuning dataset by generating candidate molecules with our pre-trained model. The preprocessed data will be stored in the folder output.
        cd MolGen
        bash preprocess.sh
    
    • Then utilize the self-feedback paradigm. The fine-tuned model will be stored in the folder checkpoint.
        bash finetune.sh
    
  • Generate

    To generate molecules, run the following script. Set checkpoint_path to choose between the pre-trained model and a fine-tuned model.

    cd MolGen
    bash generate.sh
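
The self-feedback loop above can be pictured as: generate candidates, score them with a property oracle, and keep the highest-scoring molecules as feedback targets for the next fine-tuning round. The toy sketch below illustrates only that ranking idea; the SELFIES strings and the score dictionary are placeholders, not the repo's actual scoring code (in practice a QED or p-logP oracle would be used):

```python
def rank_candidates(candidates, property_score, top_k=2):
    """Sort generated candidates by a property oracle and keep the best
    top_k as feedback targets (a toy version of the chemical-feedback idea)."""
    return sorted(candidates, key=property_score, reverse=True)[:top_k]

# Hypothetical SELFIES candidates with made-up property scores.
toy_scores = {"[C][C][O]": 0.41, "[C][=C][C]": 0.35, "[C][O]": 0.39}
best = rank_candidates(list(toy_scores), toy_scores.get, top_k=2)
# best holds the two highest-scoring candidates
```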
    

🥽 Experiments

We conduct experiments on well-known benchmarks to confirm MolGen's optimization capabilities, encompassing penalized logP, QED, and molecular docking properties. For detailed experimental settings and analysis, please refer to our paper.
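
For reference, penalized logP is conventionally defined as the octanol-water partition coefficient minus a synthetic-accessibility (SA) penalty and a penalty for rings larger than six atoms. The sketch below applies that formula to precomputed component values; computing the components themselves requires a cheminformatics toolkit such as RDKit, which is not shown here, and the numbers are for illustration only:

```python
def penalized_logp(logp, sa, largest_ring_size):
    """Penalized logP = logP - SA score - cycle penalty, where the cycle
    penalty counts how far the largest ring exceeds six atoms."""
    cycle_penalty = max(largest_ring_size - 6, 0)
    return logp - sa - cycle_penalty

# Hypothetical component values: logP 2.5, SA 3.1, an 8-membered ring.
score = penalized_logp(logp=2.5, sa=3.1, largest_ring_size=8)  # 2.5 - 3.1 - 2
```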

  • MolGen captures real-world molecular distributions

<img width="950" alt="image" src="https://github.com/zjunlp/MolGen/assets/61076726/c32bf106-d43c-4d1d-af48-8caed03305bc">
  • MolGen mitigates molecular hallucinations

Targeted molecule discovery

<img width="480" alt="image" src="https://github.com/zjunlp/MolGen/assets/61076726/51533e08-e465-44c8-9e78-858775b59b4f"> <img width="595" alt="image" src="https://github.com/zjunlp/MolGen/assets/61076726/6f17a630-88e4-46f6-9cb1-9c3637a264fc"> <img width="376" alt="image" src="https://github.com/zjunlp/MolGen/assets/61076726/4b934314-5f23-4046-a771-60cdfe9b572d">

Constrained molecular optimization

<img width="350" alt="image" src="https://github.com/zjunlp/MolGen/assets/61076726/bca038cc-637a-41fd-9b53-48ac67c4f182">

Citation

If you use or extend our work, please cite the paper as follows:

@inproceedings{fang2023domain,
  author       = {Yin Fang and
                  Ningyu Zhang and
                  Zhuo Chen and
                  Xiaohui Fan and
                  Huajun Chen},
  title        = {Domain-Agnostic Molecular Generation with Chemical Feedback},
  booktitle    = {{ICLR}},
  publisher    = {OpenReview.net},
  year         = {2024},
  url          = {https://openreview.net/pdf?id=9rPyHyjfwP}
}
