We present AlphaNet, a local frame-based equivariant model designed to tackle the challenges of achieving both accurate and efficient simulations for atomistic systems. AlphaNet enhances computational efficiency and accuracy by leveraging the local geometric structures of atomic environments through the construction of equivariant local frames and learnable frame transitions. Notably, AlphaNet offers one of the best trade-offs between computational efficiency and accuracy among existing models. Moreover, AlphaNet exhibits scalability across a broad spectrum of system and dataset sizes, affirming its versatility.
- Create a Conda Environment
  Open your terminal or command prompt and run:
  conda create -n alphanet_env python=3.8  # or a later version
- Activate the Environment
  conda activate alphanet_env
- Install Required Packages
  Navigate to your desired installation directory and run:
  pip install -r requirements.txt
- Clone the Repository
  git clone https://github.com/yourusername/AlphaNet.git
- Install AlphaNet
  Navigate into the cloned repository and install AlphaNet in editable mode:
  cd AlphaNet
  pip install -e .
This allows you to make changes to the codebase and have them reflected without reinstalling the package.
The settings are placed in a config file. See the provided JSON files for examples, or the comments in alphanet/config.py for help.
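As a rough illustration, a training config might look like the sketch below. Every key shown here is hypothetical; the actual schema is defined in alphanet/config.py and the example JSON files in the repository.

```python
import json

# All field names below are illustrative, not the real AlphaNet schema;
# consult alphanet/config.py and the example JSON files for the actual keys.
config = {
    "data_path": "data/train.pkl",
    "batch_size": 32,
    "lr": 1e-3,
    "max_epochs": 100,
}

with open("my_config.json", "w") as f:
    json.dump(config, f, indent=2)
```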
Our code is based on PyTorch Lightning. You can try a quick run with:
python mul_train.py
To prepare a dataset in pickle format, you can use one of the following scripts (a quick sanity check on the output is sketched after this list):
- from deepmd:
python scripts/dp2pic_batch.py
- from extxyz:
python scripts/xyz2pic.py
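A quick way to confirm the conversion produced something sensible is to load the pickle file and inspect it. The path below is hypothetical, and the exact structure of each record depends on the conversion script, so treat the printed output as the ground truth.

```python
import pickle

# "data/train.pkl" is a hypothetical output path from one of the scripts above.
with open("data/train.pkl", "rb") as f:
    data = pickle.load(f)

print(type(data))
# If the file holds a list of per-structure records, peek at the first one
# to see which fields (positions, forces, energies, ...) it carries.
if isinstance(data, list) and data:
    print(data[0])
```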
To convert a Lightning-formatted checkpoint to a plain state_dict file:
python scripts/pl2ckpt.py
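Assuming pl2ckpt.py writes a standard PyTorch state dict, you can load and inspect it as in the sketch below; the file name used here is hypothetical.

```python
import torch

# "converted_model.ckpt" is a hypothetical file name produced by pl2ckpt.py.
state_dict = torch.load("converted_model.ckpt", map_location="cpu")

# List a few parameter names and shapes to confirm the conversion worked.
for name, tensor in list(state_dict.items())[:5]:
    print(name, tuple(tensor.shape))
```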
You can also freeze the model for inference:
python scripts/jit_compile.py
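Assuming jit_compile.py produces a TorchScript archive, the frozen model can be loaded for inference like this; the file name is hypothetical.

```python
import torch

# "alphanet_frozen.pt" is a hypothetical file name produced by jit_compile.py.
model = torch.jit.load("alphanet_frozen.pt", map_location="cpu")
model.eval()  # switch to inference mode
```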
Once you have a converted checkpoint, you can evaluate it and plot the results:
python test.py --config path/to/config --ckpt path/to/ckpt
There is also an ASE calculator:
from alphanet.infer.calc import AlphaNetCalculator
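A minimal usage sketch with ASE is given below. The keyword arguments passed to AlphaNetCalculator are assumptions; check alphanet/infer/calc.py for the actual constructor signature.

```python
from ase.build import bulk
from alphanet.infer.calc import AlphaNetCalculator

# Build a small test structure with ASE.
atoms = bulk("Cu", "fcc", a=3.6)

# The keyword arguments below are assumptions; see alphanet/infer/calc.py
# for the actual constructor signature.
atoms.calc = AlphaNetCalculator(
    ckpt_path="path/to/ckpt",
    config="path/to/config",
)

print(atoms.get_potential_energy())  # total energy
print(atoms.get_forces())            # per-atom forces
```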
The Defected Bilayer Graphene Dataset
The Formate Decomposition on Cu Dataset
The models pretrained on OC2M and MPtrj are nearly ready for release. We are also planning to release additional pretrained models in the near future.
This model is currently ranked on the leaderboard of Matbench Discovery. It consists of approximately 16.2 million parameters.
The following resources are available in the directory:
- Model Configuration: mp.json
- Model state_dict: pre-trained weights can be downloaded from Figshare
- Path: pretrained_models/MPtrj
PS: There are still some problems we need to solve: (1) improve the smoothness of the model; (2) possibly return to a smaller model size.
This project is licensed under the GNU License - see the LICENSE file for details.
We thank all contributors and the community for their support.
AlphaNet: Scaling Up Local Frame-based Interatomic Potential