MXfold2
RNA secondary structure prediction using deep learning with thermodynamic integration
Installation
System requirements
- Python (>=3.7)
- PyTorch (>=1.4)
- C++17-compatible compiler (optional; tested with Apple clang 12.0.0 and GCC 7.4.0)
Install from wheel
We provide pre-built Python wheel packages for several platforms on the release page. Download the appropriate package for your platform and install it as follows:
% pip3 install mxfold2-0.1.2-cp310-cp310-manylinux_2_17_x86_64.whl
Install from sdist
You can build and install MXfold2 from the source distribution, downloaded from the release page, as follows:
% pip3 install mxfold2-0.1.2.tar.gz
To build MXfold2 from the source distribution, you need a C++17 compatible compiler.
Prediction
You can predict RNA secondary structures for FASTA-formatted RNA sequences as follows:
% mxfold2 predict test.fa
>DS4440
GGAUGGAUGUCUGAGCGGUUGAAAGAGUCGGUCUUGAAAACCGAAGUAUUGAUAGGAAUACCGGGGGUUCGAAUCCCUCUCCAUCCG
(((((((........(((((..((((.....))))...)))))...................(((((.......)))))))))))). (24.8)
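Each prediction consists of the sequence name, the sequence itself, and a dot-bracket structure followed by its score in parentheses. As an illustration only (this helper is not part of MXfold2), a minimal Python sketch that recovers the paired positions from a dot-bracket string:

```python
def parse_dot_bracket(structure: str) -> list[tuple[int, int]]:
    """Return the (i, j) base pairs (0-based) encoded by a dot-bracket string."""
    stack, pairs = [], []
    for i, ch in enumerate(structure):
        if ch == "(":
            stack.append(i)                 # opening position waits for its partner
        elif ch == ")":
            pairs.append((stack.pop(), i))  # closing position pairs with latest open
    return pairs

print(parse_dot_bracket("((...))"))  # [(1, 5), (0, 6)]
```

Applying this to the structure line above yields the full set of predicted base pairs, which is convenient for downstream analysis or visualization.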
By default, MXfold2 employs the parameters trained from TrainSetA and TrainSetB (see our paper).
We also provide the other pre-trained models used in our paper. Download models-0.1.0.tar.gz from the release page and extract the pre-trained models as follows:
% tar -zxvf models-0.1.0.tar.gz
Then, you can predict RNA secondary structures with a specific pre-trained model as follows:
% mxfold2 predict @./models/TrainSetA.conf test.fa
>DS4440
GGAUGGAUGUCUGAGCGGUUGAAAGAGUCGGUCUUGAAAACCGAAGUAUUGAUAGGAAUACCGGGGGUUCGAAUCCCUCUCCAUCCG
(((((((.((....))...........(((((.......))))).(((((......))))).(((((.......)))))))))))). (24.3)
Here, ./models/TrainSetA.conf specifies many parameters, including the hyper-parameters of the deep neural network model.
Training
MXfold2 can train its parameters from BPSEQ-formatted RNA secondary structures. The datasets used in our paper are also available on the release page.
% mxfold2 train --model MixC --param model.pth --save-config model.conf data/TrainSetA.lst
You can specify many of the model's hyper-parameters; see mxfold2 train --help. In this example, the model's hyper-parameters and the trained parameters are saved in model.conf and model.pth, respectively.
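In the BPSEQ format, each line gives a 1-based position, its nucleotide, and the position of its pairing partner (0 if unpaired). For illustration only (MXfold2 ships its own data loader), a minimal Python sketch of a BPSEQ parser:

```python
def parse_bpseq(text: str) -> tuple[str, list[tuple[int, int]]]:
    """Parse BPSEQ lines of the form 'position base partner' (partner 0 = unpaired)."""
    seq, pairs = [], []
    for line in text.strip().splitlines():
        pos, base, partner = line.split()
        seq.append(base)
        i, j = int(pos), int(partner)
        if 0 < j and i < j:       # record each pair once, from its 5' side
            pairs.append((i, j))  # 1-based positions, as in the file
    return "".join(seq), pairs

seq, pairs = parse_bpseq("1 G 6\n2 C 0\n3 A 0\n4 U 0\n5 A 0\n6 C 1\n")
print(seq, pairs)  # GCAUAC [(1, 6)]
```

A file list such as data/TrainSetA.lst then simply enumerates the BPSEQ files that make up the training set.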
Web server
A web server is available at http://www.dna.bio.keio.ac.jp/mxfold2/.
References
- Sato, K., Akiyama, M., Sakakibara, Y.: RNA secondary structure prediction using deep learning with thermodynamic integration. Nat Commun 12, 941 (2021). https://doi.org/10.1038/s41467-021-21194-4