
Verification-Aided Learning of NN Vector Barrier Functions

Verifying the safety of a complex dynamical system is a long-standing challenge due to the computational complexity involved and the need for domain knowledge. Can we automate safety verification by learning a barrier function that separates the system's trajectories from the unsafe regions, while maintaining formal guarantees for the learned certificate?

The answer is affirmative, and it calls for a fusion of learning and verification tools. How to get the best out of both learning and verification is the main problem considered in the paper:

More specifically, we want to find the best way to (1) handle verification failures and (2) retrain the NN safety certificate.

  • A straightforward method follows the "train-verify" procedure: we first learn a barrier function candidate and then verify whether it is valid. If verification fails, we augment the training dataset with the counterexamples found and retrain the NN barrier function. However, we have little control over this process, and the loop between training and verification may never terminate (a sketch of this loop follows the list).
  • This repository implements a "train-finetune" procedure in which the last linear layer of the NN is fine-tuned through a convex optimization algorithm with strong convergence guarantees. In practice, the "train-finetune" framework also achieves a higher success rate in learning a formal NN safety certificate.
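
For concreteness, here is a minimal sketch of the baseline "train-verify" loop. The callables train_fn and verify_fn are hypothetical placeholders for the training and verification routines, not this repository's actual interfaces.

from typing import Callable, List

# Minimal sketch of the baseline "train-verify" (CEGIS) loop.
# train_fn and verify_fn are hypothetical stand-ins, not this
# repository's actual interfaces.
def train_verify(dataset: List,
                 train_fn: Callable,   # dataset -> candidate NN barrier
                 verify_fn: Callable,  # candidate -> (is_valid, counterexamples)
                 max_rounds: int = 20):
    for _ in range(max_rounds):
        candidate = train_fn(dataset)
        is_valid, counterexamples = verify_fn(candidate)
        if is_valid:
            return candidate             # formally verified certificate
        dataset.extend(counterexamples)  # augment the data and retrain
    return None                          # not guaranteed to terminate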

Verification-Aided Learning Framework

Intuitively, we rely on machine learning to find a candidate solution that is close to being valid, and then close the remaining gap using optimization. The verification-aided learning framework consists of three main steps:

  1. Train an NN vector barrier function
  2. Fine-tune the last linear layer of the NN barrier function to obtain a barrier function with formal guarantees
  3. If the search fails, generate counterexamples to augment the dataset and retrain the model using sequential learning techniques

The fine-tuning step follows a counterexample-guided inductive synthesis (CEGIS) framework and can be viewed as applying cutting-plane algorithms in the parameter space of the last layer's weights. A detailed explanation of the fine-tuning step can be found in the paper; a rough sketch follows.
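
The following sketch illustrates one analytic-center cutting-plane iteration over the last layer's weights w, using cvxpy. The cut matrices A, b and the function names are assumptions made for illustration; see ACCPM.py for the actual implementation.

import cvxpy as cp
import numpy as np

# Illustrative ACCPM step: the localization set {w : A w <= b} collects
# the linear cuts gathered so far (initial bounds plus counterexample cuts).
def analytic_center(A: np.ndarray, b: np.ndarray) -> np.ndarray:
    w = cp.Variable(A.shape[1])
    # The analytic center maximizes the sum of log-slacks of all cuts,
    # placing the next query point deep inside the localization set.
    cp.Problem(cp.Maximize(cp.sum(cp.log(b - A @ w)))).solve()
    return w.value

def add_cut(A: np.ndarray, b: np.ndarray, a: np.ndarray, beta: float):
    # A failed verification at the current center yields a cut a^T w <= beta
    # that removes the center (and nearby invalid weights) from the set.
    return np.vstack([A, a]), np.append(b, beta)

Each iteration queries the verifier at the analytic center; a counterexample produces a new cut, and the localization set shrinks until a verified weight vector is found.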

Numerical Examples

For a double integrator system controlled by a ReLU network controller, the following figure shows that the learned NN barrier function effectively separates the system's trajectories from the unsafe region:

The following table shows that the proposed "train-finetune" framework achieves a higher success rate than the "train-verify" framework in learning a valid NN barrier function.

Codebase Organization

  • In the experiments, we apply a mixed-integer programming (MIP)-based NN verifier and alpha, beta-CROWN for testing. The MIP-based verification is implemented on top of Gurobi and can be found in ACCPM.py. The code for alpha, beta-CROWN is included in the complete_verifier folder, where complete_verification.py provides interfaces for verifying NN barrier functions (a sketch of the MIP encoding appears after this list).
  • Training of the NN barrier function and augmentation of the counterexamples are implemented in train_barrier.py.
  • The fine-tuning algorithm is based on the analytic center cutting-plane method and is implemented in ACCPM.py.
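
To give a flavor of the MIP-based approach, the sketch below shows the standard big-M mixed-integer encoding of a single ReLU unit, the building block such verifiers use to model the network exactly. The bounds and variable names are illustrative assumptions, not the actual encoding code in ACCPM.py.

import gurobipy as gp
from gurobipy import GRB

# Big-M MIP encoding of one ReLU unit y = max(x, 0), given pre-activation
# bounds lo <= x <= hi (assumed pre-computed). Illustrative only; the actual
# model encodes the full network and the barrier function conditions.
lo, hi = -1.0, 2.0
m = gp.Model("relu_sketch")
x = m.addVar(lb=lo, ub=hi, name="x")      # pre-activation
y = m.addVar(lb=0.0, name="y")            # post-activation (y >= 0)
z = m.addVar(vtype=GRB.BINARY, name="z")  # z = 1 iff the ReLU is active

m.addConstr(y >= x)                 # y >= x
m.addConstr(y <= x - lo * (1 - z))  # z = 1  =>  y <= x, hence y = x
m.addConstr(y <= hi * z)            # z = 0  =>  y <= 0, hence y = 0

m.setObjective(y, GRB.MAXIMIZE)
m.optimize()                        # worst-case output; here y attains hi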

Installation

The following environment setup has been tested.

conda create --name test python==3.7.13
conda activate test

# Install PyTorch with a version suitable for your device; see https://pytorch.org/get-started/previous-versions/. The following versions have been tested and are recommended.

# CPU only
pip install torch==1.11.0+cpu torchvision==0.12.0+cpu torchaudio==0.11.0 --extra-index-url https://download.pytorch.org/whl/cpu

or 

# CUDA 11.3
pip install torch==1.11.0+cu113 torchvision==0.12.0+cu113 torchaudio==0.11.0 --extra-index-url https://download.pytorch.org/whl/cu113


# Install the remaining packages
pip install -r requirements.txt
conda install -c conda-forge cvxpy
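
Optionally, run a quick sanity check that the core dependencies import correctly (versions follow the setup above):

# Optional sanity check
python -c "import torch; print(torch.__version__, torch.cuda.is_available())"
python -c "import cvxpy; print(cvxpy.__version__)"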

Run Experiments

To reproduce the experiments in the examples folder, run the following commands in a terminal:

cd examples/double_integrator_mip
python main_train_DI.py --config double_integrator.yaml --seed 0 --method fine-tuning

The experimental setup parameters are specified, with comments, in the yaml configuration file.
