
ALEBREW: The Atomic Learning Environment for Building REliable interatomic neural netWork potentials

Official repository for the paper "Uncertainty-biased molecular dynamics for learning uniformly accurate interatomic potentials". It provides a package for building uniformly accurate machine-learned interatomic potentials with uncertainty-biased molecular dynamics and active learning. The data sets generated with this package and the configuration files needed to reproduce the main results of the paper are available at Zenodo. The ALEBREW logo was generated with DALL-E.

[ALEBREW logo]

Citing us

Please consider citing us if you find the code and paper useful:

@misc{zaverkin2023uncertaintybiased,
  title={Uncertainty-biased molecular dynamics for learning uniformly accurate interatomic potentials}, 
  author={Viktor Zaverkin and David Holzmüller and Henrik Christiansen and Federico Errica and Francesco Alesiani and Makoto Takamoto and Mathias Niepert and Johannes Kästner},
  year={2023},
  eprint={2312.01416},
  archivePrefix={arXiv},
  primaryClass={physics.comp-ph}
}

Implemented methods

This repository implements

  • Gaussian moment neural network potentials;
  • Total and atom-based uncertainties for biasing and terminating atomistic simulations (a minimal MD termination sketch follows this list);
  • Ensemble-free last-layer and sketched gradient-based features to compute posterior- and distance-based uncertainties;
  • Ensemble-based uncertainties;
  • Batch selection strategies to generate maximally diverse training data sets with active learning;
  • Uncertainty calibration methods; we recommend using inductive conformal prediction in the experiments (a calibration sketch also follows this list);
  • Interface with ASE to run molecular dynamics simulations (NVT, NPT, etc.) and reference calculations;
  • Adversarial training to use with last-layer and sketched gradient-based uncertainties.
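For illustration, the snippet below sketches the kind of ASE-driven simulation loop with uncertainty-based termination that these components enable. It is a minimal sketch, not ALEBREW's API: ASE's EMT calculator stands in for a trained potential, and ensemble_uncertainty is a hypothetical placeholder for any of the uncertainty measures listed above.

# Minimal sketch of uncertainty-terminated MD via ASE (not ALEBREW's API).
# ASE's EMT calculator stands in for a trained potential, and
# `ensemble_uncertainty` is a hypothetical placeholder.
import numpy as np
from ase.build import bulk
from ase.calculators.emt import EMT
from ase.md.langevin import Langevin
from ase import units

atoms = bulk("Cu", cubic=True) * (2, 2, 2)
atoms.calc = EMT()

dyn = Langevin(atoms, timestep=2.0 * units.fs, temperature_K=300.0,
               friction=0.01 / units.fs)

def ensemble_uncertainty(atoms):
    # Placeholder: in ALEBREW this would be a posterior-, distance-, or
    # ensemble-based uncertainty evaluated on the current configuration.
    return np.random.rand()

threshold = 0.95      # calibrated uncertainty threshold (arbitrary here)
check_interval = 10   # MD steps between uncertainty checks

for _ in range(100):
    dyn.run(check_interval)
    if ensemble_uncertainty(atoms) > threshold:
        # Stop once the model becomes unreliable; the current configuration
        # is a natural candidate for reference labeling.
        break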
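Similarly, the recommended uncertainty calibration, inductive conformal prediction, can be sketched generically as follows. This is a generic recipe on placeholder arrays, not ALEBREW's implementation: the calibrated scale factor is the finite-sample-corrected quantile of normalized nonconformity scores on a held-out calibration set.

# Generic sketch of inductive conformal calibration (placeholder data,
# not ALEBREW's implementation).
import numpy as np

rng = np.random.default_rng(0)
abs_errors = np.abs(rng.normal(size=1000))   # |y_true - y_pred| on a calibration set
raw_uncertainty = 0.5 + rng.random(1000)     # heuristic per-sample uncertainties

alpha = 0.05                                 # target miscoverage level
scores = abs_errors / raw_uncertainty        # normalized nonconformity scores

n = len(scores)
# Finite-sample-corrected (1 - alpha) quantile of the scores.
q_level = min(1.0, np.ceil((n + 1) * (1 - alpha)) / n)
scale = np.quantile(scores, q_level)

# Calibrated uncertainties: |error| <= scale * raw_uncertainty with ~95% coverage.
calibrated_uncertainty = scale * raw_uncertainty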

License

This source code has a non-commercial license; see LICENSE.txt for more details.

Requirements

An environment with PyTorch (>=2.0.0), BMDAL-REG (version 3), and ASE (>=3.22.1) installed. Further dependencies may be necessary; see the alebrew-{cuda,cpu,mps}.yml files.
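A quick way to verify the environment is a short import check; the sketch below assumes the standard import names, in particular bmdal_reg for BMDAL-REG.

# Quick environment sanity check (a minimal sketch; assumes standard import names).
import torch
import ase
import bmdal_reg  # BMDAL-REG; adjust if your installation uses a different name

print("PyTorch:", torch.__version__)  # expected >= 2.0.0
print("ASE:", ase.__version__)        # expected >= 3.22.1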

Installation

First, clone this repository into a directory of your choice: git clone https://github.com/nec-research/alebrew.git <dest_dir>. Then, move to <dest_dir> and install the required packages into a conda environment, e.g., with conda env create -f alebrew-{cuda,cpu,mps}.yml. Finally, add <dest_dir> to your PYTHONPATH environment variable: export PYTHONPATH=<dest_dir>:$PYTHONPATH.

Training potentials with a data set

Currently, we implement only Gaussian moment neural network potentials. To train them on a data set, specify the model parameters in the config.yaml file. The config.yaml.default file provides an example configuration for training an interatomic potential for alanine dipeptide using the test data set from Zenodo.

To use the default configuration, first download ala2_test.extxyz from Zenodo and run cp config.yaml.default config.yaml. Then start training a Gaussian moment neural network model with python scripts/run_train.py config.yaml; the training progress is written to models/ala2/train.out and can be followed with, e.g., tail -f models/ala2/train.out. After training, evaluate the potential by running python scripts/run_test.py config.yaml; the test results, computed on data not used during training, are stored in models/ala2/test_results.json.
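To inspect the test metrics programmatically, the stored JSON file can be loaded as sketched below; the exact keys in test_results.json depend on the run, so the snippet makes no assumptions about them.

# Minimal sketch for inspecting test metrics; the keys stored in
# test_results.json depend on the run, so none are assumed here.
import json

with open("models/ala2/test_results.json") as f:
    results = json.load(f)

for key, value in results.items():
    print(f"{key}: {value}")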

Generating data sets with ALEBREW from scratch

For the more involved task of generating training data sets and training machine-learned interatomic potentials from scratch with our active learning workflow, we provide a detailed tutorial in the examples/using_alebrew.ipynb Jupyter notebook.

How to reproduce the results from the paper

To reproduce the results from the paper, replace the parameters in the provided Jupyter notebook (examples/using_alebrew.ipynb) with those given in the config.yaml files under {ala2-ffs,mil53}/{task}/{learning_method} at Zenodo. See also the docstrings in alebrew/task_execuption.py for more information on the available parameters.
