This Python package facilitates the construction of NEP (neuroevolution potential) models using active learning. It employs the same strategies as MTP [J. Chem. Phys. 159, 084112 (2023)] and ACE [Phys. Rev. Materials 7, 043801 (2023)].
The package automates active learning by submitting and monitoring jobs. The workflow is as follows:
Step 1: If this is the first iteration and no structures are available, you can generate simple structures. For instance, you can perturb a crystal structure to create an initial training set (a sketch is given after this step).
If a `train.xyz` file (containing energy and force data) already exists before the first iteration, this step will be skipped.
Note: The initial training set should not be too small for active-set selection. For a NEP with a 30×30 neural network, approximately 1000 local environments (e.g., 10 structures with 100 atoms each) are required. If your NEP includes N elements, you will need about 1000 local environments for each element type (N × 1000 in total).
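For illustration, here is a minimal sketch (not part of nep_maker) of how such an initial set could be generated with ASE by randomly perturbing a bulk crystal. The element, lattice constant, supercell size, displacement amplitude, and strain range are placeholders to adapt to your system; energies and forces are added later by the DFT step.

```python
# Minimal sketch: build perturbed copies of a bulk crystal as an initial
# training set. All numerical values below are illustrative placeholders.
import numpy as np
from ase.build import bulk
from ase.io import write

rng = np.random.default_rng(42)
frames = []
for i in range(10):
    atoms = bulk("Na", "bcc", a=4.29, cubic=True).repeat((3, 3, 3))  # 54 atoms
    atoms.rattle(stdev=0.05, seed=i)                        # random atomic displacements (Å)
    strain = np.eye(3) + rng.uniform(-0.03, 0.03, (3, 3))   # small random cell deformation
    atoms.set_cell(atoms.cell[:] @ strain, scale_atoms=True)
    frames.append(atoms)

write("init_structures.xyz", frames)  # energies/forces are added by the DFT step later
```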
Step 2: The MaxVol algorithm is used to select an active set, which consists of reference environments for calculating the extrapolation level.
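To make the idea concrete, the following self-contained sketch applies the MaxVol row-exchange procedure to a random descriptor matrix. It is illustrative only, not the package's implementation; the matrix size, tolerance, and iteration limit are placeholder values.

```python
# Minimal sketch of MaxVol: given descriptor rows B (n x m, n >= m), iteratively
# swap rows into a square submatrix A to (locally) maximise |det A|.
# The selected rows form the active set.
import numpy as np

def maxvol(B, tol=1.01, max_iter=200):
    n, m = B.shape
    idx = np.arange(m)                     # start from the first m rows (assumed non-singular)
    for _ in range(max_iter):
        C = B @ np.linalg.inv(B[idx])      # coefficients of every row in the A basis
        i, j = np.unravel_index(np.abs(C).argmax(), C.shape)
        if abs(C[i, j]) <= tol:            # no swap enlarges |det A| any further
            break
        idx[j] = i                         # replace row j of A with row i of B
    return idx

rng = np.random.default_rng(0)
B = rng.standard_normal((1000, 30))        # e.g. 1000 environments, 30 descriptor components
active = maxvol(B)                         # indices of the selected reference environments
```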
Step 3: Train the NEP using the structures from Step 1. If a `nep.txt` file is provided, this step will be skipped in the first iteration.
Step 4: Run GPUMD to actively select new structures based on an extrapolation-level cutoff. If no structures are generated, the loop terminates. You can run multiple MD simulations under different conditions to accelerate exploration.
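For orientation, in MaxVol-based active learning (as in MTP) the extrapolation grade of an atomic environment is typically max |d · A^-1|, where d is its descriptor row and A is the active-set matrix from Step 2; a snapshot is collected when any of its grades exceeds the cutoff. GPUMD performs this check internally during the MD run; the sketch below only illustrates the criterion, with placeholder matrices and a placeholder cutoff.

```python
# Illustrative sketch of the extrapolation-level check (the real check runs inside GPUMD).
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((30, 30))          # placeholder active-set matrix from Step 2
snapshot = rng.standard_normal((64, 30))   # placeholder descriptors of one MD snapshot (64 atoms)

grades = np.abs(snapshot @ np.linalg.inv(A)).max(axis=1)  # per-atom extrapolation grades
if grades.max() > 2.0:                     # placeholder cutoff
    print("snapshot flagged as extrapolative; add it to the candidate pool")
```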
Step 5: While the structures from Step 4 have high extrapolation levels, they may lack diversity. Therefore, another MaxVol selection is performed to identify the most representative structures.
First, install the required dependencies, including PyNEP and CuPy. Refer to this page for detailed instructions.
- Clone the code by running `git clone --recursive https://github.com/psn417/nep_maker.git`.
- For Bash users:
  - Open `~/.bashrc` in a text editor.
  - Add the following line at the end of the file: `export PYTHONPATH=$PYTHONPATH:/path/to/my_packages`
  - Save the file and reload the shell configuration: `source ~/.bashrc`
- Place the `nep_maker` package into the `my_packages` directory to make it accessible to Python.
- Verify the installation by opening Python and typing `import nep_maker`. If the import is successful, the package is installed correctly.
Prepare the following input files:

- `init_structures.xyz`: Structures used to train the NEP in the first iteration.
- (Optional) `train.xyz` and `nep.txt`: If you already have an NEP, you can perform active learning based on it (a quick format check is sketched after this list).
- `run.in`: Specifies how to train the NEP.
- Job scripts: Define how to run NEP, GPUMD, and other tasks. Examples are provided for LSF; modify them according to your job system.
- `vasp.yaml`: Specifies the VASP parameters and pseudopotentials, including the location of the pseudopotentials. Refer to the ASE documentation for more details.
- `run_active.py`: The main script for the active learning process. Only one CPU core is needed to run it.
- `submit_command.sh`: Specifies how to submit your jobs. For LSF, it is `bsub < job.sh`; for Slurm, it is `sbatch job.sh`.
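As an optional sanity check (not part of the package), you can verify with ASE that an existing `train.xyz` actually carries energies and forces before starting a run; this assumes the frames use the standard extended-XYZ keys that ASE reads into a calculator.

```python
# Optional sanity check (illustrative): make sure every frame in train.xyz
# carries energy and force data readable by ASE before launching run_active.py.
from ase.io import read

frames = read("train.xyz", index=":")        # read all frames
for i, atoms in enumerate(frames):
    assert atoms.calc is not None, f"frame {i} has no attached energy/force data"
    atoms.get_potential_energy()             # raises if the energy is missing
    atoms.get_forces()                       # raises if the forces are missing

print(f"{len(frames)} labelled frames, {sum(len(a) for a in frames)} atoms in total")
```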
An example folder, `example_Na`, is provided with all necessary files.