NOTE: A new implementation based on the Deep Graph Library and PyTorch called the Materials Graph Library (MatGL) has replaced this implementation. This repository has been archived and will no longer be maintained. It will be kept purely as a reference implementation. Users are recommended to use matgl instead.
M3GNet
M3GNet is a new materials graph neural network architecture that incorporates 3-body interactions. A key difference with prior materials graph implementations such as MEGNet is the addition of the coordinates for atoms and the 3×3 lattice matrix in crystals, which are necessary for obtaining tensorial quantities such as forces and stresses via auto-differentiation.
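The role of auto-differentiation here is that forces are the negative gradient of the energy with respect to atomic coordinates (and stresses the gradient with respect to the lattice). As a toy illustration unrelated to M3GNet's actual internals, the plain-Python sketch below checks an analytic Lennard-Jones force against a finite-difference gradient of the energy, which is what autodiff computes exactly and cheaply:

```python
def lj_energy(r, eps=1.0, sigma=1.0):
    # Lennard-Jones pair energy, a toy stand-in for a learned energy model
    return 4 * eps * ((sigma / r) ** 12 - (sigma / r) ** 6)

def lj_force(r, eps=1.0, sigma=1.0):
    # Analytic force F = -dE/dr
    return 4 * eps * (12 * sigma ** 12 / r ** 13 - 6 * sigma ** 6 / r ** 7)

def numeric_force(r, h=1e-6):
    # Central finite difference of the energy; autodiff yields the same
    # quantity exactly, without the step-size tuning
    return -(lj_energy(r + h) - lj_energy(r - h)) / (2 * h)

# The two force evaluations agree closely at a sample separation
print(abs(lj_force(1.2) - numeric_force(1.2)))
```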
As a framework, M3GNet has diverse applications, including:
- Interatomic potential development. With the same training data, M3GNet performs similarly to state-of-the-art machine learning interatomic potentials (ML-IAPs). However, a key feature of a graph representation is its flexibility to scale to diverse chemical spaces. One of the key accomplishments of M3GNet is the development of a universal IAP that can work across the entire periodic table of the elements by training on relaxations performed in the Materials Project.
- Surrogate models for property predictions. Like the previous MEGNet architecture, M3GNet can be used to develop surrogate models for property predictions, achieving in many cases accuracies that are better than or similar to those of other state-of-the-art ML models.
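As a quick sketch of the surrogate-model use case, the helper below wraps a pre-trained formation-energy model. The model name `MP-2018.6.1-Eform` and the `M3GNet.load`/`predict_structure` API are assumptions based on the pre-archive m3gnet package; verify them against your installed version (or use matgl instead).

```python
def predict_formation_energy(structure):
    """Predict formation energy (eV/atom) for a pymatgen Structure.

    NOTE: the model name and API below are assumed from the pre-archive
    m3gnet package; check your installed version before relying on them.
    """
    from m3gnet.models import M3GNet  # imported lazily: requires tensorflow

    model = M3GNet.load("MP-2018.6.1-Eform")  # assumed pre-trained model name
    return float(model.predict_structure(structure))

# Assumed usage: predict_formation_energy(Structure.from_file("POSCAR"))
```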
For detailed performance benchmarks, please refer to the publication in the References section. The API documentation is available via the GitHub Pages site.
Table of Contents
- System requirements
- Installation
- Change Log
- Usage
- Model training
- Matterverse
- API docs
- Datasets
- References
System requirements
Inferences using the pre-trained models can be run on any standard computer. For model training, the GPU memory needs to be > 18 GB for a batch size of 32 with the crystal training data. In our work, we used a single RTX 3090 GPU for model training.
Installation
M3GNet can be installed via pip:
pip install m3gnet
You can also directly download the source from Github and install from source.
Apple Silicon Installation
Apple Silicon (M1, M1 Pro, M1 Max, M1 Ultra) has extremely powerful ML capabilities, but special steps are needed for the installation of tensorflow and other dependencies. Here are the recommended installation steps.
- Ensure that you already have Xcode and the command-line tools installed.
- Install Miniconda or Anaconda.
- Create a Python 3.9 environment.
  conda create --name m3gnet python=3.9
  conda activate m3gnet
- Install tensorflow and its dependencies for Apple Silicon.
  conda install -c apple tensorflow-deps
  pip install tensorflow-macos
- If you wish, you can install tensorflow-metal, which helps speed up training. If you encounter strange tensorflow errors, you should uninstall tensorflow-metal and see if it fixes the errors first.
  pip install tensorflow-metal
- Install m3gnet but ignore dependencies (otherwise, pip will look for tensorflow).
  pip install --no-deps m3gnet
- Install other dependencies like pymatgen, etc. manually.
  pip install protobuf==3.20.0 pymatgen ase cython
- Once you are done, you can try running pytest m3gnet to see if all tests pass.
Change Log
See change log
Usage
Structure relaxation
An M3GNet universal potential for the periodic table has been developed using data from Materials Project relaxations since 2012. This universal potential can be used to perform structural relaxation of any arbitrary crystal as follows.
import warnings
from m3gnet.models import Relaxer
from pymatgen.core import Lattice, Structure
for category in (UserWarning, DeprecationWarning):
    warnings.filterwarnings("ignore", category=category, module="tensorflow")
# Init a Mo structure with stretched lattice (DFT lattice constant ~ 3.168)
mo = Structure(Lattice.cubic(3.3), ["Mo", "Mo"], [[0., 0., 0.], [0.5, 0.5, 0.5]])
relaxer = Relaxer() # This loads the default pre-trained model
relax_results = relaxer.relax(mo, verbose=True)
final_structure = relax_results['final_structure']
final_energy_per_atom = float(relax_results['trajectory'].energies[-1] / len(mo))
print(f"Relaxed lattice parameter is {final_structure.lattice.abc[0]:.3f} Å")
print(f"Final energy is {final_energy_per_atom:.3f} eV/atom")
The output is as follows:
Relaxed lattice parameter is 3.169 Å
Final energy is -10.859 eV/atom
The initial lattice parameter of 3.3 Å was successfully relaxed to 3.169 Å, close to the DFT value of 3.168 Å. The final energy of -10.859 eV/atom is also close to the Materials Project DFT value of -10.8456 eV/atom.
The relaxation takes less than 20 seconds on a single laptop.
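Beyond one-off relaxations, the same universal potential can drive ASE workflows through a calculator wrapper. The class names below (`M3GNetCalculator`, `Potential`) follow the pre-archive m3gnet API and should be treated as assumptions to verify against your installed version:

```python
def make_m3gnet_calculator():
    """Build an ASE calculator backed by the M3GNet universal potential.

    NOTE: class names are assumed from the pre-archive m3gnet API;
    verify against your installed version (or use matgl instead).
    """
    from m3gnet.models import M3GNet, M3GNetCalculator, Potential  # lazy: requires tensorflow

    return M3GNetCalculator(potential=Potential(M3GNet.load()))

# Assumed usage with an ase.Atoms object:
#   atoms.calc = make_m3gnet_calculator()
#   energy = atoms.get_potential_energy()
```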
The table below provides more comprehensive benchmarks for cubic crystals based on experimental data from Wikipedia and DFT data from the Materials Project. The Jupyter notebook is in the examples folder. This benchmark is limited to cubic crystals for ease of comparison since there is only one lattice parameter. Of course, M3GNet is not limited to cubic systems (see the LiFePO4 example).
| Material | Crystal structure | a (Å) | MP a (Å) | M3GNet a (Å) | % error vs Expt | % error vs MP |
| :---------- | :---------------- | ------: | -------: | -----------: | :-------------- | :------------ |
| Ac | FCC | 5.31 | 5.66226 | 5.6646 | 6.68% | 0.04% |
| Ag | FCC | 4.079 | 4.16055 | 4.16702 | 2.16% | 0.16% |
| Al | FCC | 4.046 | 4.03893 | 4.04108 | -0.12% | 0.05% |
| AlAs | Zinc blende (FCC) | 5.6605 | 5.73376 | 5.73027 | 1.23% | -0.06% |
| AlP | Zinc blende (FCC) | 5.451 | 5.50711 | 5.50346 | 0.96% | -0.07% |
| AlSb | Zinc blende (FCC) | 6.1355 | 6.23376 | 6.22817 | 1.51% | -0.09% |
| Ar | FCC | 5.26 | 5.64077 | 5.62745 | 6.99% | -0.24% |
| Au | FCC | 4.065 | 4.17129 | 4.17431 | 2.69% | 0.07% |
| BN | Zinc blende (FCC) | 3.615 | 3.626 | 3.62485 | 0.27% | -0.03% |
| BP | Zinc blende (FCC) | 4.538 | 4.54682 | 4.54711 | 0.20% | 0.01% |
| Ba | BCC | 5.02 | 5.0303 | 5.03454 | 0.29% | 0.08% |
| C (diamond) | Diamond (FCC) | 3.567 | 3.57371 | 3.5718 | 0.13% | -0.05% |
| Ca | FCC | 5.58 | 5.50737 | 5.52597 | -0.97% | 0.34% |
| CaVO3 | Cubic perovskite | 3.767 | 3.83041 | 3.83451 | 1.79% | 0.11% |
| CdS | Zinc blende (FCC) | 5.832 | 5.94083 | 5.9419 | 1.88% | 0.02% |
| CdSe | Zinc blende (FCC) | 6.05 | 6.21283 | 6.20987 | 2.64% | -0.05% |
| CdTe | Zinc blende (FCC) | 6.482 | 6.62905 | 6.62619 | 2.22% | -0.04% |
| Ce | FCC | 5.16 | 4.72044 | 4.71921 | -8.54% | -0.03% |
| Cr | BCC | 2.88 | 2.87403 | 2.84993 | -1.04% | -0.84% |
| CrN | Halite | 4.149 | - | 4.16068 | 0.28% | - |
| Cs | BCC | 6.05 | 6.11004 | 5.27123 | -12.87% | -13.73% |
| CsCl | Caesium chloride | 4.123 | 4.20906 | 4.20308 | 1.94% | -0.14% |
| CsF | Halite | 6.02 | 6.11801 | 6.1265 | 1.77% | 0.14% |
| CsI | Caesium chloride | 4.567 | 4.66521 | 4.90767 | 7.46% | 5.20% |
| Cu | FCC | 3.597 | 3.62126 | 3.61199 | 0.42% | -0.26% |
| Eu | BCC | 4.61 | 4.63903 | 4.34783 | -5.69% | -6.28% |
| EuTiO3 | Cubic perovskite | 7.81 | 3.96119 | 3.92943 | -49.69% | -0.80% |
| Fe | BCC | 2.856 | 2.84005 | 2.85237 | -0.13% | 0.43% |
| GaAs | Zinc blende (FCC) | 5.653 | 5.75018 | 5.75055 | 1.73% | 0.01% |
| GaP | Zinc blende (FCC) | 5.4505 | 5.5063 | 5.5054 | 1.01% | -0.02% |
| GaSb | Zinc blende (FCC) | 6.0959 | 6.21906 | 6.21939 | 2.03% | 0.01% |
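The error columns above are the usual signed relative deviation, (a_pred - a_ref) / a_ref x 100%. For example, the Ag row can be reproduced as:

```python
def pct_error(pred, ref):
    # Signed relative deviation in percent, as used in the table columns
    return (pred - ref) / ref * 100

# Ag row: Expt a = 4.079 Å, MP a = 4.16055 Å, M3GNet a = 4.16702 Å
vs_expt = pct_error(4.16702, 4.079)
vs_mp = pct_error(4.16702, 4.16055)
print(f"{vs_expt:.2f}% vs Expt, {vs_mp:.2f}% vs MP")  # 2.16% vs Expt, 0.16% vs MP
```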