
Introduction

This repository contains the code for our ICLR 2024 accepted Spotlight paper, Unleashing the Potential of Fractional Calculus in Graph Neural Networks with FROND.


Requirements

To install the required dependencies, refer to the environment.yaml file.
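A typical conda-based setup might look like the following (a sketch: the actual environment name is whatever the `name:` field in environment.yaml declares — "frond" below is a guess):

```shell
# Create the conda environment from the provided spec
conda env create -f environment.yaml
# Activate it; replace "frond" with the name declared in environment.yaml
conda activate frond
```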

Reproducing Results

To run our code, go to the /src folder and run:

python run_GNN_frac_all.py
  --dataset    Cora, Citeseer, Pubmed, CoauthorCS, CoauthorPhy, Computers, Photo
  --function   laplacian / transformer
  --block      constant_frac / att_frac
  --method     predictor / predictor_corrector
  --alpha_ode  a value in (0, 1]; the fractional order beta in the paper
  --time       integration time
  --step_size  step size of the numerical solver

For example:

python run_GNN_frac_all.py --dataset Cora --function laplacian --block att_frac --cuda 1 --method predictor --epoch 400 --seed 123 --runtime 10 --decay 0.01 --dropout 0.2 --hidden_dim 256 --input_dropout 0.6 --alpha_ode 0.85 --time 40 --step_size 1.0 --lr 0.01
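Since --alpha_ode is the fractional order beta, a quick way to study its effect is to sweep it. A hypothetical shell loop, using only the flags documented above:

```shell
# Hypothetical sweep over the fractional order beta (--alpha_ode);
# all other flags are taken from the example command above
for a in 0.3 0.5 0.7 0.9 1.0; do
  python run_GNN_frac_all.py --dataset Cora --function laplacian \
    --block constant_frac --method predictor \
    --alpha_ode "$a" --time 40 --step_size 1.0
done
```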

Reference

Our code builds on the following repositories:

The FDE solver is from torchfde.

The graph neural ODE model is based on the GRAND, GraphCON, and GraphCDE frameworks.
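For intuition about what the predictor method computes, here is a minimal NumPy sketch of the standard fractional Adams-Bashforth predictor for a Caputo FDE D^alpha y = f(t, y) with 0 < alpha <= 1. This is an illustrative re-derivation, not torchfde's actual code or API:

```python
import math
import numpy as np

def frac_predictor(f, y0, alpha, t_end, h):
    """Fractional Adams-Bashforth predictor for the Caputo FDE
    D^alpha y = f(t, y), 0 < alpha <= 1 (sketch, not torchfde's API)."""
    n_steps = int(round(t_end / h))
    ys = [np.asarray(y0, dtype=float)]
    fs = []                      # history of f(t_j, y_j): the memory term
    gamma = math.gamma(alpha + 1.0)
    for n in range(n_steps):
        fs.append(f(n * h, ys[-1]))
        # power-law memory weights b_{j,n+1} = (n+1-j)^alpha - (n-j)^alpha;
        # for alpha = 1 every weight is 1 and this reduces to explicit Euler
        acc = np.zeros_like(ys[0])
        for j in range(n + 1):
            b = (n + 1 - j) ** alpha - (n - j) ** alpha
            acc = acc + b * fs[j]
        ys.append(ys[0] + (h ** alpha / gamma) * acc)
    return ys
```

The key difference from an ordinary ODE step is the sum over the whole history: each update weights all past evaluations of f with slowly decaying power-law coefficients, which is what gives fractional dynamics their long memory.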

Citation

If you find our work useful, please cite us as follows:

@INPROCEEDINGS{KanZhaDin:C24,
    author    = {Qiyu Kang and Kai Zhao and Qinxu Ding and Feng Ji and Xuhao Li and Wenfei Liang and Yang Song and Wee Peng Tay},
    title     = {Unleashing the Potential of Fractional Calculus in Graph Neural Networks with {FROND}},
    booktitle = {Proc. International Conference on Learning Representations},
    year      = {2024},
    address   = {Vienna, Austria},
    note      = {\textbf{spotlight}},
}


Issues

Details of env

Hi,

I'm trying to set up the environment to run your code and learn more about it. I'm encountering errors during installation, specifically related to the llvm-openmp package.

Could you please provide some information about the environment used for developing the code? This might include:

Operating System: (e.g., Windows 10, Ubuntu 18.04)
Python Version: (e.g., Python 3.7)
Conda environment details: (If a specific conda environment was used)
Versions of key dependencies: (Particularly llvm-openmp if known)
This information would be very helpful in troubleshooting the installation issues I'm facing.

Thank you for your time and for sharing your work!

Sincerely,
Mohsen
