
Physics-aware Difference Graph Networks for Sparsely-Observed Dynamics

This repository is the official PyTorch implementation of "Physics-aware Difference Graph Networks for Sparsely-Observed Dynamics".

Sungyong Seo*, Chuizheng Meng*, Yan Liu, Physics-aware Difference Graph Networks for Sparsely-Observed Dynamics, ICLR 2020.

Data

Download the required data.zip from Google Drive. Then:

cd /path/to/the/root/of/project
mkdir data
mv /path/to/data.zip ./data/
cd data
unzip data.zip

Environment

Docker (Recommended!)

First, follow the official documentation of Docker and nvidia-docker to install Docker with CUDA support.

Use the following commands to build a docker image containing all necessary packages:

cd docker
bash build_docker.sh

This script also copies jupyter_notebook_config.py, the Jupyter Notebook configuration file, into the Docker image. The default Jupyter Notebook password is 12345.
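If you want a different password, the setting lives in jupyter_notebook_config.py. A minimal sketch of what such a file typically contains (assumed contents for illustration, not a copy of the repository's file):

# Assumed contents for illustration -- not the repository's actual config file.
# Jupyter Notebook stores a salted hash of the login password.
from notebook.auth import passwd

c = get_config()                          # provided by Jupyter when the config loads
c.NotebookApp.password = passwd('12345')  # replace '12345' with your own password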

Use the following script to create a container from the built image:

bash rundocker-melady.sh

If the project directory is not under your home directory, modify rundocker-melady.sh to change the file mapping (the volume mount passed to docker run) accordingly.

Manual Installation

# install python packages
pip install pyyaml tensorboardX geopy networkx tqdm
conda install pytorch==1.1.0 torchvision==0.2.2 cudatoolkit=9.0 -c pytorch
conda install -y matplotlib scipy pandas jupyter scikit-learn geopandas
conda install -y -c conda-forge jupyterlab igl meshplot

# install pytorch_geometric
export PATH=/usr/local/cuda/bin:$PATH
export CPATH=/usr/local/cuda/include:$CPATH
export LD_LIBRARY_PATH=/usr/local/cuda/lib64:$LD_LIBRARY_PATH
pip install --verbose --no-cache-dir torch-scatter==1.2.0
pip install --verbose --no-cache-dir torch-sparse==0.4.0
pip install --verbose --no-cache-dir torch-cluster==1.3.0
pip install --verbose --no-cache-dir torch-spline-conv==1.1.0
pip install torch-geometric==1.1.2

# pin numpy==1.16.2 to avoid a data loading error (numpy>=1.16.3 defaults np.load to allow_pickle=False)
pip install -I numpy==1.16.2 
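The pin matters because numpy 1.16.3 switched the default of np.load to allow_pickle=False, which can break loading the project's data files. If you must use a newer numpy, each load site would need a change along these lines (data/example.npy is a hypothetical path):

# Hypothetical example: with numpy>=1.16.3, loading a pickled array requires
# passing allow_pickle=True explicitly; numpy==1.16.2 keeps the old default.
import numpy as np

arr = np.load('data/example.npy', allow_pickle=True)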

Run

Experiments in Section 3.1 "Approximation of Directional Derivatives"

See the Jupyter Notebook approx-gradient/synthetic-gradient-approximation.ipynb for details.
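For intuition about what the notebook evaluates, here is a minimal, self-contained sketch (illustrative only, not the notebook's code) of estimating a directional derivative from sparsely observed values on a graph edge and comparing it with the analytic value:

# Illustrative sketch only, not the notebook's code: finite-difference estimate
# of a directional derivative from sparse observations at graph nodes.
import numpy as np

np.random.seed(0)
pos = np.random.rand(50, 2)                    # irregularly placed node positions
f = np.sin(pos[:, 0]) * np.cos(pos[:, 1])      # scalar signal sampled at the nodes

def edge_directional_derivative(i, j):
    """Finite-difference estimate of the derivative of f along edge i -> j."""
    d = pos[j] - pos[i]
    return (f[j] - f[i]) / np.linalg.norm(d)

# Compare against the analytic directional derivative at node i for one edge.
i, j = 0, 1
u = (pos[j] - pos[i]) / np.linalg.norm(pos[j] - pos[i])   # unit edge direction
grad_i = np.array([np.cos(pos[i, 0]) * np.cos(pos[i, 1]),
                   -np.sin(pos[i, 0]) * np.sin(pos[i, 1])])
print(edge_directional_derivative(i, j), grad_i @ u)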

Experiments in Section 3.2 "Graph Signal Prediction" and Section 4 "Prediction: Graph Signals on Land-based Weather Stations"

cd scripts
python train.py --extconf /path/to/exp/config/file --mode train --device cuda:0

Examples (a sketch for running all three sequentially follows the list):

  • PA-DGN, Graph Signal Prediction of Synthetic Data

    cd scripts
    python train.py --extconf ../confs/iclrexps/irregular_varicoef_diff_conv_eqn_4nn_42_250sample/GraphPDE_GN_sum_notshared_4nn/conf.yaml --mode train --device cuda:0

  • PA-DGN, Prediction of Graph Signals on Land-based Weather Stations

    cd scripts
    python train.py --extconf ../confs/iclrexps/noaa_pt_states_withloc/GraphPDE_GN_RGN_16_notshared_4nn/conf.yaml --mode train --device cuda:0

  • PA-DGN, Sea Surface Temperature (SST) Prediction

    cd scripts
    python train.py --extconf ../confs/iclrexps/sst-daily_4nn_42_250sample/GraphPDE_GN_sum_notshared_4nn/conf.yaml --mode train --device cuda:0
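To run all three examples back to back, a small driver along these lines works (a convenience sketch using the exact commands above; run it from the repository root):

# Convenience sketch: run the three example experiments sequentially.
# The config paths are copied verbatim from the examples above.
import subprocess

confs = [
    '../confs/iclrexps/irregular_varicoef_diff_conv_eqn_4nn_42_250sample/GraphPDE_GN_sum_notshared_4nn/conf.yaml',
    '../confs/iclrexps/noaa_pt_states_withloc/GraphPDE_GN_RGN_16_notshared_4nn/conf.yaml',
    '../confs/iclrexps/sst-daily_4nn_42_250sample/GraphPDE_GN_sum_notshared_4nn/conf.yaml',
]
for conf in confs:
    subprocess.run(['python', 'train.py', '--extconf', conf,
                    '--mode', 'train', '--device', 'cuda:0'],
                   cwd='scripts', check=True)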

Summary of Results

You can use results/print_results.ipynb to print tables of experiment results, including the mean and the standard error of the mean absolute error (MAE) for the prediction tasks.
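The reported numbers reduce to the usual formulas; a minimal standalone sketch (the values in maes are placeholders, not real results):

# Minimal sketch of the summary statistics: mean and standard error of the MAE
# over repeated runs. The values below are placeholders, not actual results.
import numpy as np

maes = np.array([0.123, 0.131, 0.127])        # hypothetical per-run MAEs
mean = maes.mean()
sem = maes.std(ddof=1) / np.sqrt(len(maes))   # standard error of the mean
print(f'MAE: {mean:.4f} +/- {sem:.4f}')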

Reference

@inproceedings{seo*2020physicsaware,
    title={Physics-aware Difference Graph Networks for Sparsely-Observed Dynamics},
    author={Sungyong Seo* and Chuizheng Meng* and Yan Liu},
    booktitle={International Conference on Learning Representations},
    year={2020},
    url={https://openreview.net/forum?id=r1gelyrtwH}
}
