
ALIGNN

Atomistic Line Graph Neural Network
https://scholar.google.com/citations?user=9Q-tNnwAAAAJ | https://www.youtube.com/@dr_k_choudhary




Table of Contents

- Introduction
- Installation
- Examples

<a name="intro"></a>

ALIGNN & ALIGNN-FF (Introduction)

The Atomistic Line Graph Neural Network (https://www.nature.com/articles/s41524-021-00650-1) introduces a new graph convolution layer that explicitly models both two-body and three-body interactions in atomistic systems. This is achieved by composing two edge-gated graph convolution layers: the first applied to the atomistic line graph L(g) (representing triplet, i.e. bond-angle, interactions) and the second applied to the atomistic bond graph g (representing pair interactions).
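To make the line-graph idea concrete, here is a minimal, self-contained sketch (not the ALIGNN/DGL implementation, which builds L(g) from interatomic neighbor lists and attaches bond angles as line-graph edge features): each node of L(g) is a bond of g, and two line-graph nodes are connected whenever their bonds share an atom, so every line-graph edge corresponds to an atom triplet.

```python
def line_graph(bonds):
    """Build the edges of the line graph L(g) of a bond graph g.

    bonds: list of (i, j) atom-index pairs (the edges of g).
    Returns a list of (a, b) index pairs into `bonds`: two bonds are
    connected in L(g) when they share an atom (an i-j-k triplet).
    """
    lg_edges = []
    for a in range(len(bonds)):
        for b in range(a + 1, len(bonds)):
            if set(bonds[a]) & set(bonds[b]):  # bonds share an atom -> triplet
                lg_edges.append((a, b))
    return lg_edges

# A water-like molecule: O (atom 0) bonded to two H atoms (1 and 2).
bonds = [(0, 1), (0, 2)]
print(line_graph(bonds))  # -> [(0, 1)]: one triplet, the H-O-H angle
```

In ALIGNN itself, message passing alternates between g and L(g), so updates to triplet (angle) features feed back into the bond features and vice versa.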

The Atomistic Line Graph Neural Network-based force field (ALIGNN-FF) (https://pubs.rsc.org/en/content/articlehtml/2023/dd/d2dd00096b) can be used to model both structurally and chemically diverse systems with any combination of 89 elements from the periodic table, especially for structural optimization. To train the ALIGNN-FF model, we used the JARVIS-DFT dataset, which contains around 75,000 materials and 4 million energy-force entries, of which 307,113 are used in the training. These models can be further fine-tuned, or new models can be developed from scratch on a new dataset.
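As a rough sketch of what such a training or fine-tuning setup can look like: ALIGNN models are typically trained from a folder of structure files plus a CSV of target values, driven by a JSON config. The keys below follow the general pattern of the config file shipped with the repository, but treat every name and value here as an illustrative assumption rather than the project's actual defaults; consult the repository's example config for the authoritative schema.

```json
{
  "dataset": "user_data",
  "target": "target",
  "atom_features": "cgcnn",
  "epochs": 100,
  "batch_size": 64,
  "cutoff": 8.0,
  "max_neighbors": 12,
  "model": {
    "name": "alignn_atomwise",
    "alignn_layers": 4,
    "gcn_layers": 4
  }
}
```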

ALIGNN layer schematic

<a name="install"></a> Installation

First create a conda environment: install Miniconda from https://conda.io/miniconda.html. Based on your system, you'll download a file named something like 'Miniconda3-latest-XYZ'.

Now, run the installer:

```
bash Miniconda3-latest-Linux-x86_64.sh    # Linux
bash Miniconda3-latest-MacOSX-x86_64.sh   # macOS
```

On Windows, download the 32/64-bit Python 3.10 Miniconda installer (.exe) and run it.

Method 1 (conda based installation, recommended)

Now, let's make a conda environment, say "my_alignn" (choose another name if you like):

```
conda create --name my_alignn python=3.10 -y
conda activate my_alignn
conda install dgl=2.1.0 pytorch torchvision torchaudio pytorch-cuda -c pytorch -c nvidia
conda install alignn -y
```

Method 2 (GitHub based installation)

You can also install a development version of alignn by cloning the repository and installing it in place with pip:

```
conda create --name my_alignn python=3.10 -y
conda activate my_alignn
conda install dgl=2.1.0 pytorch torchvision torchaudio pytorch-cuda -c pytorch -c nvidia
git clone https://github.com/usnistgov/alignn
cd alignn
python -m pip install -e .
```

Method 3 (PyPI-based installation)

As an alternative, ALIGNN can also be installed using pip. Note: we have received several messages regarding DGL installation issues; see the DGL installation documentation for details. Example for PyTorch 2.1 + CUDA 12.1 + pip (stable) on Windows:

```
pip install -q dgl -f https://data.dgl.ai/wheels/torch-2.1/cu121/repo.html
pip install alignn
```

With no GPU/CUDA:

```
pip install -q dgl -f https://data.dgl.ai/wheels/torch-2.1/repo.html
pip install alignn
```

You can find installation examples in the Google Colab notebooks below.

<a name="example"></a> Examples

| Notebooks | Google Colab | Descriptions |
| --- | --- | --- |
| Regression task (graph-wise prediction) | ![Open in Google Colab] | Example of developing a single-output regression model for exfoliation energies of 2D materials. |
| Machine learning force-field training from scratch | ![Open in Google Colab] | Example of training a machine learning force field for silicon. |
| ALIGNN-FF Relaxer + EV_curve + Phonons + Interface gamma_surface + Interface separation | ![Open in Google Colab] | Examples of using the pre-trained ALIGNN-FF force-field model. |
| Scaling/timing comparison | ![Open in Google Colab] | Examples of analyzing scaling. |
