vita-group / unified-lth-gnn

[ICML 2021] "A Unified Lottery Tickets Hypothesis for Graph Neural Networks", Tianlong Chen*, Yongduo Sui*, Xuxi Chen, Aston Zhang, Zhangyang Wang

Home Page: https://tianlong-chen.github.io/about/

License: MIT License

lottery-ticket-hypothesis graph-neural-networks co-design co-optimization graph-sparsification pruning

unified-lth-gnn's Introduction

A Unified Lottery Tickets Hypothesis for Graph Neural Networks

License: MIT

[ICML 2021] A Unified Lottery Tickets Hypothesis for Graph Neural Networks

Tianlong Chen*, Yongduo Sui*, Xuxi Chen, Aston Zhang, Zhangyang Wang

Overview

Summary of our achieved performance (y-axis) at different graph and GNN sparsity levels (x-axis) on Cora and Citeseer node classification. The size of each marker represents the inference MACs (= 0.5 FLOPs) of the sparse GCN on the corresponding sparsified graph. Black circles indicate the baseline, i.e., unpruned dense GNNs on the full graph. Blue circles are random pruning results. Orange circles represent the performance of a previous graph sparsification approach, i.e., ADMM. Red stars are established by our method (UGS).


Methodology

For details, please refer to our Paper.

Implementation

Node classification on Cora, Citeseer, PubMed

Refer to README

Link Prediction on Cora, Citeseer, PubMed

Refer to README

Experiments on OGB Datasets

Refer to Ogbn_ArXiv (README)

Refer to Ogbn_Proteins (README)

Refer to Ogbn_Collab (README)

Citation

@misc{chen2021unified,
      title={A Unified Lottery Ticket Hypothesis for Graph Neural Networks}, 
      author={Tianlong Chen and Yongduo Sui and Xuxi Chen and Aston Zhang and Zhangyang Wang},
      year={2021},
      eprint={2102.06790},
      archivePrefix={arXiv},
      primaryClass={cs.LG}
}

Acknowledgement

https://github.com/Shen-Lab/SS-GCNs

https://github.com/cmavro/Graph-InfoClust-GIC

https://github.com/lightaime/deep_gcns_torch

unified-lth-gnn's People

Contributors

tianlong-chen, yongduosui


unified-lth-gnn's Issues

Question about sparsity level

Hi! This is an excellent paper. I had a question about the sparsity level reported in the paper, specifically Table 2, which compares GLT vs. Random GLT on the Cora dataset. Is this the same sparsity value that is printed while running the code, as shown below? Or are the values reported in the paper 1 − sparsity?

syd : Sparsity:[1], Best Val:[78.80] at epoch:[20] | Final Test Acc:[79.70] Adj:[99.99%] Wei:[79.89%]

The code calculates the percentage of non-zeros:

wei_spar = weight_nonzero * 100 / weight_total
# wei_spar2 = 1 - wei_spar
print("-" * 100)
print("Sparsity: Wei:[{:.2f}%]".format(wei_spar))
print("-" * 100)

Thank you in advance for your time!
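For what it's worth, the two conventions differ only by complementation. A quick sketch with made-up counts (chosen to match the Wei:[79.89%] log line above) illustrates the difference between the percentage the logging code prints (remaining non-zeros) and the "sparsity" a paper would typically report (fraction pruned):

```python
# Hypothetical counts, chosen to reproduce the Wei:[79.89%] line above.
weight_total = 10000
weight_nonzero = 7989

# What the repo's logging computes: percentage of weights still non-zero.
remaining_pct = weight_nonzero * 100 / weight_total

# The complementary convention: percentage of weights pruned away.
sparsity_pct = 100 - remaining_pct

print(f"remaining: {remaining_pct:.2f}%  sparsity: {sparsity_pct:.2f}%")
# → remaining: 79.89%  sparsity: 20.11%
```

So if the paper reports sparsity as the fraction removed, it would correspond to 1 − (the printed value / 100).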

Dictionary error when training on OGB datasets

Hi!
I am trying to run the pruning for the OGB datasets, and I keep getting the error below. It runs for 1 epoch on 10 subgraphs and then fails with the following traceback. Any idea how to resolve this?

File "/home/relationalml/Desktop/Adarsh/GLT/Pruning/OGBN_proteins/unify/ogb/ogbn_proteins/main_imp.py", line 227, in <module>
    final_state_dict = main_get_mask(args, imp_num, resume_train_ckpt)
  File "/home/relationalml/Desktop/Adarsh/GLT/Pruning/OGBN_proteins/unify/ogb/ogbn_proteins/main_imp.py", line 178, in main_get_mask
    final_state_dict = pruning.save_all(dataset, 
  File "/home/relationalml/Desktop/Adarsh/GLT/Pruning/OGBN_proteins/unify/ogb/ogbn_proteins/pruning.py", line 73, in save_all
    'optimizer_state_dict': optimizer.state_dict()
  File "/home/relationalml/anaconda3/envs/gnnlottery/lib/python3.10/site-packages/torch/optim/optimizer.py", line 175, in state_dict
    packed_state = {(param_mappings[id(k)] if isinstance(k, torch.Tensor) else k): v
  File "/home/relationalml/anaconda3/envs/gnnlottery/lib/python3.10/site-packages/torch/optim/optimizer.py", line 175, in <dictcomp>
    packed_state = {(param_mappings[id(k)] if isinstance(k, torch.Tensor) else k): v
KeyError: 140292191978176
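One plausible cause (an assumption on my part, not confirmed against the repo's code): `Optimizer.state_dict()` raises this `KeyError` when the optimizer's internal `state` still holds entries keyed by parameter tensors that no longer appear in `param_groups`, e.g. after parameters are re-created during weight rewinding in iterative pruning. A minimal standalone repro of that failure mode:

```python
import torch

model = torch.nn.Linear(4, 2)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

# One step populates opt.state, keyed by the current parameter tensors.
model(torch.randn(3, 4)).sum().backward()
opt.step()

# Simulate rewinding: replace a parameter with a fresh tensor and point
# the optimizer's param_groups at the new parameter list.
model.weight = torch.nn.Parameter(model.weight.detach().clone())
opt.param_groups[0]["params"] = list(model.parameters())

# opt.state still holds an entry for the *old* weight tensor, whose id()
# is no longer in param_groups, so state_dict() fails with a KeyError.
try:
    opt.state_dict()
except KeyError as e:
    print("KeyError:", e)
```

If that matches the setup here, saving `optimizer.state_dict()` before swapping parameters, or re-creating the optimizer after rewinding, would avoid the stale entries.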

Why both DGL and PyG?

Hi, it seems that you have used DGL for the comparatively smaller graphs (Cora, Citeseer, and PubMed) and PyTorch Geometric for the larger OGB graphs. Is there any specific reason for that? Please advise me if I got it wrong.
