
chainer-chemistry's Introduction

Chainer Chemistry: A Library for Deep Learning in Biology and Chemistry


Chainer Chemistry Overview

Chainer Chemistry is a deep learning framework (based on Chainer) with applications in biology and chemistry. It supports various state-of-the-art models, especially GCNNs (Graph Convolutional Neural Networks), for chemical property prediction.

For more information, please refer to the documentation. Also, a quick introduction to deep learning for molecules and Chainer Chemistry is available here.

Dependencies

Chainer Chemistry depends on the following packages:

These are installed automatically when the library is installed via pip (see Installation). However, the following must be installed manually:

Please refer to the RDKit documentation for more information regarding the installation steps.

Note that only the following versions of Chainer Chemistry's dependencies are currently supported:

| Chainer Chemistry | Chainer        | RDKit       | Python        |
|-------------------|----------------|-------------|---------------|
| v0.1.0 ~ v0.3.0   | v2.0 ~ v3.0    | 2017.09.3.0 | 2.7, 3.5, 3.6 |
| v0.4.0            | v3.0 ~ v4.0 *1 | 2017.09.3.0 | 2.7, 3.5, 3.6 |
| v0.5.0            | v3.0 ~ v5.0 *2 | 2017.09.3.0 | 2.7, 3.5, 3.6 |
| v0.6.0            | v6.0 ~ *3      | 2017.09.3.0 | 2.7, 3.5, 3.6 |
| v0.7.0 ~ v0.7.1   | v7.0 ~         | 2019.03.2.0 | 3.6, 3.7 *4   |
| master branch *5  | v7.0 ~         | 2019.03.2.0 | 3.6, 3.7      |

[Footnote]

*1: We used FunctionNode in this PR, which was introduced in chainer v3. See this issue for details.

*2: Saliency modules only work with chainer v5 or later.

*3: Chainer v6 introduced ChainerX. In order to support this new feature and API, we broke backward compatibility in the Chainer Chemistry v0.6.0 release. See the ChainerX documentation for details.

*4: Python 2.x support has been dropped, following the same policy as chainer and rdkit.

*5: As announced on the Chainer blog, further development is limited to serious bug fixes and maintenance only.

Installation

Chainer Chemistry can be installed using the pip command, as follows:

pip install chainer-chemistry

Example to install rdkit with conda:

# newer conda version is necessary to install rdkit 2019.03.2.0
conda install -n base conda==4.6.14
conda install -c rdkit rdkit==2019.03.2.0

If you would like to use the latest sources, please check out the master branch and install it with the following commands:

git clone https://github.com/pfnet-research/chainer-chemistry.git
pip install -e chainer-chemistry

Sample Code

Sample code is provided with this repository. This includes, but is not limited to, the following:

  • Training a new model on a given dataset
  • Performing inference on a given dataset, using a pretrained model
  • Evaluating and reporting performance metrics of different models on a given dataset

Please refer to the examples directory for more information.

Supported Models

The following graph convolutional neural networks are currently supported:

  • NFP: Neural Fingerprint [2, 3]
  • GGNN: Gated Graph Neural Network [4, 3]
  • WeaveNet [5, 3]
  • SchNet [6]
  • RSGCN: Renormalized Spectral Graph Convolutional Network [10]
    * The name is not from the original paper - see PR #89 for the naming convention.
  • RelGCN: Relational Graph Convolutional Network [14]
  • GAT: Graph Attention Networks [15]
  • GIN: Graph Isomorphism Networks [17]
  • MPNN: Message Passing Neural Networks [3]
  • Set2Set [19]
  • GNN-FiLM: Graph Neural Networks with Feature-wise Linear Modulation [20]
  • MEGNet: MatErials Graph Network [24]
  • CGCNN: Crystal Graph Convolutional Neural Networks [25]
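Most of the graph convolutions above share a common per-layer update: aggregate neighbor features over the adjacency matrix, apply a learned linear map, then a nonlinearity. As an illustrative sketch (not the library's actual implementation), here is the renormalized update of Kipf & Welling [10] in plain NumPy:

```python
import numpy as np

def graph_conv_step(adj, h, w):
    """One graph-convolution layer on a single molecule.

    adj: (n_atoms, n_atoms) adjacency matrix
    h:   (n_atoms, in_dim) atom feature matrix
    w:   (in_dim, out_dim) learned weight matrix
    """
    a_tilde = adj + np.eye(adj.shape[0])          # add self-loops
    d_inv_sqrt = np.diag(1.0 / np.sqrt(a_tilde.sum(axis=1)))
    a_hat = d_inv_sqrt @ a_tilde @ d_inv_sqrt     # symmetric normalization
    return np.maximum(a_hat @ h @ w, 0.0)         # H' = ReLU(A_hat H W)

# Toy molecule: 3 atoms in a chain, 4-dim input features, 2-dim output.
adj = np.array([[0., 1., 0.],
                [1., 0., 1.],
                [0., 1., 0.]])
h = np.random.rand(3, 4)
w = np.random.rand(4, 2)
out = graph_conv_step(adj, h, w)
print(out.shape)  # (3, 2): one feature vector per atom
```

The individual models differ in how they aggregate (gating in GGNN, attention in GAT, sums over MLPs in GIN, and so on), but this shape of computation is the common core.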

We also provide experimental support for attaching the Graph Warp Module (GWM) [18] to the following models:

  • NFP ('nfp_gwm')
  • GGNN ('ggnn_gwm')
  • RSGCN ('rsgcn_gwm')
  • GIN ('gin_gwm')

In the directory examples/molnet_wle, we have implemented the new preprocessing "Weisfeiler-Lehman Embedding for Molecular Graph Neural Networks" [26] for several GNN architectures. Please see the README in that directory for usage and details.

Supported Datasets

The following datasets are currently supported:

Chemical

  • QM9 [7, 8]
  • Tox21 [9]
  • MoleculeNet [11]
  • ZINC (only 250k dataset) [12, 13]
  • User (own) dataset
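For a user's own dataset, molecules are typically encoded as aligned NumPy arrays: an atom-ID array, an adjacency matrix, and a label per molecule, mirroring the NumpyTupleDataset used elsewhere in this repository. A sketch, with a made-up example molecule (the helper name is ours, not a library API):

```python
import numpy as np

def make_example(atomic_nums, bonds, label):
    """Encode one molecule as (atoms, adjacency, label) arrays.

    atomic_nums: list of atomic numbers, one per atom
    bonds:       list of (i, j) index pairs for bonded atom pairs
    label:       integer class label
    """
    n = len(atomic_nums)
    atoms = np.asarray(atomic_nums, dtype=np.int32)
    adj = np.zeros((n, n), dtype=np.float32)
    for i, j in bonds:
        adj[i, j] = adj[j, i] = 1.0   # undirected bond
    return atoms, adj, np.asarray([label], dtype=np.int32)

# Water (one O bonded to two H) with a toy binary label.
atoms, adj, label = make_example([8, 1, 1], [(0, 1), (0, 2)], 1)
print(atoms.dtype, adj.shape, label.shape)  # int32 (3, 3) (1,)
```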

Network

  • cora [21]
  • citeseer [22]
  • reddit [23]

Research Projects

If you use Chainer Chemistry in your research, feel free to submit a pull request and add the name of your project to this list:

  • BayesGrad: Explaining Predictions of Graph Convolutional Networks (paper, code)
  • Graph Warp Module: an Auxiliary Module for Boosting the Power of Graph Neural Networks (paper, code)
  • GraphNVP: An Invertible Flow Model for Generating Molecular Graphs (paper, code)
  • Graph Residual Flow for Molecular Graph Generation (paper)

Useful Links

Chainer Chemistry:

Other Chainer frameworks:

License

This project is released under the MIT License. Please refer to this page for more information.

Please note that Chainer Chemistry is still under experimental development. We continuously strive to improve its functionality and performance, but at this stage we cannot guarantee the reproducibility of any results published in papers. Use the library at your own risk.

References

[1] Seiya Tokui, Kenta Oono, Shohei Hido, and Justin Clayton. Chainer: a next-generation open source framework for deep learning. In Proceedings of Workshop on Machine Learning Systems (LearningSys) in Advances in Neural Information Processing System (NIPS) 28, 2015.

[2] David K Duvenaud, Dougal Maclaurin, Jorge Iparraguirre, Rafael Bombarell, Timothy Hirzel, Alan Aspuru-Guzik, and Ryan P Adams. Convolutional networks on graphs for learning molecular fingerprints. In C. Cortes, N. D. Lawrence, D. D. Lee, M. Sugiyama, and R. Garnett, editors, Advances in Neural Information Processing Systems (NIPS) 28, pages 2224–2232. Curran Associates, Inc., 2015.

[3] Justin Gilmer, Samuel S Schoenholz, Patrick F Riley, Oriol Vinyals, and George E Dahl. Neural message passing for quantum chemistry. arXiv preprint arXiv:1704.01212, 2017.

[4] Yujia Li, Daniel Tarlow, Marc Brockschmidt, and Richard Zemel. Gated graph sequence neural networks. arXiv preprint arXiv:1511.05493, 2015.

[5] Steven Kearnes, Kevin McCloskey, Marc Berndl, Vijay Pande, and Patrick Riley. Molecular graph convolutions: moving beyond fingerprints. Journal of computer-aided molecular design, 30(8):595–608, 2016.

[6] Kristof Schütt, Pieter-Jan Kindermans, Huziel Enoc Sauceda Felix, Stefan Chmiela, Alexandre Tkatchenko, and Klaus-Robert Müller. SchNet: A continuous-filter convolutional neural network for modeling quantum interactions. In I. Guyon, U. V. Luxburg, S. Bengio, H. Wallach, R. Fergus, S. Vishwanathan, and R. Garnett, editors, Advances in Neural Information Processing Systems (NIPS) 30, pages 992–1002. Curran Associates, Inc., 2017.

[7] Lars Ruddigkeit, Ruud Van Deursen, Lorenz C Blum, and Jean-Louis Reymond. Enumeration of 166 billion organic small molecules in the chemical universe database gdb-17. Journal of chemical information and modeling, 52(11):2864–2875, 2012.

[8] Raghunathan Ramakrishnan, Pavlo O Dral, Matthias Rupp, and O Anatole Von Lilienfeld. Quantum chemistry structures and properties of 134 kilo molecules. Scientific data, 1:140022, 2014.

[9] Ruili Huang, Menghang Xia, Dac-Trung Nguyen, Tongan Zhao, Srilatha Sakamuru, Jinghua Zhao, Sampada A Shahane, Anna Rossoshek, and Anton Simeonov. Tox21 challenge to build predictive models of nuclear receptor and stress response pathways as mediated by exposure to environmental chemicals and drugs. Frontiers in Environmental Science, 3:85, 2016.

[10] Kipf, Thomas N. and Welling, Max. Semi-Supervised Classification with Graph Convolutional Networks. International Conference on Learning Representations (ICLR), 2017.

[11] Zhenqin Wu, Bharath Ramsundar, Evan N. Feinberg, Joseph Gomes, Caleb Geniesse, Aneesh S. Pappu, Karl Leswing, and Vijay Pande. MoleculeNet: A Benchmark for Molecular Machine Learning. arXiv preprint arXiv:1703.00564, 2017.

[12] J. J. Irwin, T. Sterling, M. M. Mysinger, E. S. Bolstad, and R. G. Coleman. Zinc: a free tool to discover chemistry for biology. Journal of chemical information and modeling, 52(7):1757–1768, 2012.

[13] Preprocessed csv file downloaded from https://raw.githubusercontent.com/aspuru-guzik-group/chemical_vae/master/models/zinc_properties/250k_rndm_zinc_drugs_clean_3.csv

[14] Michael Schlichtkrull, Thomas N. Kipf, Peter Bloem, Rianne van den Berg, Ivan Titov, Max Welling. Modeling Relational Data with Graph Convolutional Networks. Extended Semantic Web Conference (ESWC), 2018.

[15] Veličković, P., Cucurull, G., Casanova, A., Romero, A., Liò, P., & Bengio, Y. (2017). Graph Attention Networks. arXiv preprint arXiv:1710.10903.

[16] Dan Busbridge, Dane Sherburn, Pietro Cavallo and Nils Y. Hammerla. (2019). Relational Graph Attention Networks. https://openreview.net/forum?id=Bklzkh0qFm

[17] Keyulu Xu, Weihua Hu, Jure Leskovec, and Stefanie Jegelka. "How Powerful are Graph Neural Networks?". arXiv:1810.00826 [cs.LG], 2018 (to appear at ICLR19).

[18] K. Ishiguro, S. Maeda, and M. Koyama. "Graph Warp Module: an Auxiliary Module for Boosting the Power of Graph Neural Networks". arXiv:1902.01020 [cs.LG], 2019.

[19] Oriol Vinyals, Samy Bengio, Manjunath Kudlur. Order Matters: Sequence to sequence for sets. arXiv preprint arXiv:1511.06391, 2015.

[20] Marc Brockschmidt. "GNN-FiLM: Graph Neural Networks with Feature-wise Linear Modulation". arXiv:1906.12192 [cs.ML], 2019.

[21] McCallum, Andrew Kachites and Nigam, Kamal and Rennie, Jason and Seymore, Kristie, Automating the Construction of Internet Portals with Machine Learning. Information Retrieval, 2000.

[22] C. Lee Giles and Kurt D. Bollacker and Steve Lawrence, CiteSeer: An Automatic Citation Indexing System. Proceedings of the Third ACM Conference on Digital Libraries, 1998.

[23] William L. Hamilton and Zhitao Ying and Jure Leskovec, Inductive Representation Learning on Large Graphs. Advances in Neural Information Processing Systems 30: Annual Conference on Neural Information Processing Systems 2017, 4-9 December 2017

[24] Chi Chen, Weike Ye, Yunxing Zuo, Chen Zheng, and Shyue Ping Ong. Graph networks as a universal machine learning framework for molecules and crystals. Chemistry of Materials, 31(9):3564–3572, 2019.

[25] Tian Xie and Jeffrey C Grossman. Crystal graph convolutional neural networks for an accurate and interpretable prediction of material properties. Physical review letters, 120(14):145301, 2018.

[26] Katsuhiko Ishiguro, Kenta Oono, and Kohei Hayashi. "Weisfeiler-Lehman Embedding for Molecular Graph Neural Networks". arXiv:2006.06909, 2020.

chainer-chemistry's People

Contributors

amaotone, anaruse, cks-coil, corochann, delta2323, ir5, k-ishiguro, k-ujihara, knshnb, ktns, lennmars, mihaimorariu, mist714, mottodora, msakai, n-yoshikawa, natsukium, nissy-dev, taizoayase, yoshikawamasashi, zaltoprofen


chainer-chemistry's Issues

Unit test failed

_________________________________________ test_get_molnet_bbbp_dataset_with_smiles _________________________________________

    @pytest.mark.slow
    def test_get_molnet_bbbp_dataset_with_smiles():
        # test default behavior
        pp = AtomicNumberPreprocessor()
        datasets = molnet.get_molnet_dataset('bbbp', preprocessor=pp,
                                             return_smiles=True)
    
        assert 'smiles' in datasets.keys()
        assert 'dataset' in datasets.keys()
        smileses = datasets['smiles']
        datasets = datasets['dataset']
        assert len(smileses) == 3
        assert len(datasets) == 3
    
        # Test each train, valid and test dataset
        for i, dataset in enumerate(datasets):
            # --- Test dataset is correctly obtained ---
            index = np.random.choice(len(dataset), None)
            atoms, label = dataset[index]
    
            assert atoms.ndim == 1  # (atom, )
            assert atoms.dtype == np.int32
        # (atom from, atom to) or (edge_type, atom from, atom to)
        assert label.ndim == 1
            assert label.shape[0] == 1
            assert label.dtype == np.int32
>           assert len(dataset) == expect_bbbp_lengths[i]
E           assert 1630 == 1631
E            +  where 1630 = len(<chainer_chemistry.datasets.numpy_tuple_dataset.NumpyTupleDataset object at 0x2b5c61c03d30>)

datasets_tests/molnet_tests/test_molnet.py:81: AssertionError

Unittest in test_schnet.py often fails

test_backward in test_schnet.py compares gradients calculated numerically against those calculated by backpropagation. It often, though not every time, fails on both CPU and GPU.
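The failing check above compares analytic and numerical gradients. A minimal sketch of that kind of check, using a simple stand-in function rather than SchNet itself:

```python
import numpy as np

def f(x):
    """Stand-in scalar loss function."""
    return np.sum(x ** 3)

def grad_f(x):
    """Analytic gradient of f (what backprop would compute)."""
    return 3 * x ** 2

def numerical_grad(f, x, eps=1e-5):
    """Central-difference approximation of the gradient of f at x."""
    g = np.zeros_like(x)
    for i in range(x.size):
        d = np.zeros_like(x)
        d.flat[i] = eps
        g.flat[i] = (f(x + d) - f(x - d)) / (2 * eps)
    return g

x = np.random.rand(4)
print(np.allclose(grad_f(x), numerical_grad(f, x), atol=1e-4))  # True
```

Such checks are sensitive to the step size eps and the comparison tolerance, which is one common reason they fail intermittently rather than deterministically.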

Accuracy is not accuracy in QM9 example

main/accuracy and validation/main/accuracy in the QM9 example are actually error metrics, which should be minimized. It might be helpful to rename them to abs_error or something similar.

fail to import rdkit.Chem.rdmolops

I failed to import rdkit.Chem.rdmolops.
You can reproduce this issue with the following steps.

conda create -n test python=3.6 anaconda
source activate test
pip install chainer-chemistry
conda install -c rdkit rdkit
python
from rdkit.Chem.rdmolops import *

Error message is
module compiled against API version 0xb but this version of numpy is 0xa

I avoided this error by upgrading numpy (pip install numpy --upgrade).

Inference of Tox21 example fails in GPU mode

Code to reproduce

$ python train_tox21.py --method nfp --label NR-AR --conv-layers 1 --gpu 0 --epoch 1 --unit-num 10
$ python inference_tox21.py --in-dir result/ --gpu 0

Log

$ python inference_tox21.py --in-dir result/ --gpu 0

load from cache input/nfp_NR-AR
Use NFP predictor...
Traceback (most recent call last):
  File "inference_tox21.py", line 62, in <module>
    main()
  File "inference_tox21.py", line 58, in main
    y_pred = inference_loop.inference(test)
  File "/home/delta/dev/chainer-chemistry1/examples/tox21/predictor.py", line 137, in inference
    device=-1)
  File "/home/delta/dev/chainer-chemistry1/examples/tox21/predictor.py", line 107, in customized_inference
    y = self.predictor.predict(*x)
  File "/home/delta/dev/chainer-chemistry1/examples/tox21/predictor.py", line 68, in predict
    x = self.__call__(atoms, adjs)
  File "/home/delta/dev/chainer-chemistry1/examples/tox21/predictor.py", line 62, in __call__
    x = self.graph_conv(atoms, adjs)
  File "/home/delta/dev/chainer-chemistry1/chainerchem/models/nfp.py", line 144, in __call__
    h = self.embed(atom_array)
  File "/home/delta/dev/chainer-chemistry1/chainerchem/links/embed_atom_id.py", line 47, in __call__
    h = super(EmbedAtomID, self).__call__(x)
  File "/home/delta/.pyenv/versions/anaconda3-4.3.0/envs/anaconda3/lib/python3.6/site-packages/chainer/links/connection/embed_id.py", line 70, in __call__
    return embed_id.embed_id(x, self.W, ignore_label=self.ignore_label)
  File "/home/delta/.pyenv/versions/anaconda3-4.3.0/envs/anaconda3/lib/python3.6/site-packages/chainer/functions/connection/embed_id.py", line 170, in embed_id
    return EmbedIDFunction(ignore_label=ignore_label).apply((x, W))[0]
  File "/home/delta/.pyenv/versions/anaconda3-4.3.0/envs/anaconda3/lib/python3.6/site-packages/chainer/function_node.py", line 245, in apply
    outputs = self.forward(in_data)
  File "/home/delta/.pyenv/versions/anaconda3-4.3.0/envs/anaconda3/lib/python3.6/site-packages/chainer/functions/connection/embed_id.py", line 35, in forward
    .format(type(W), type(x)))
ValueError: numpy and cupy must not be used together
type(W): <class 'cupy.core.core.ndarray'>, type(x): <class 'numpy.ndarray'>

Error in QM9 example

Code to reproduce

$ cd examples
$ python tox21/train_tox21.py --method nfp --conv-layers 1 --gpu -1 --epoch 1 --unit-num 10
$ python qm9/train_qm9.py --method nfp --conv_layers 1 --gpu -1 --epoch 1 --unit_num 10

Error log (when running train_qm9.py)

$ python qm9/train_qm9.py --method nfp --conv_layers 1 --gpu -1 --epoch 1 --unit_num 10
load from cache input/nfp_all
preprocessing dataset...
100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 133885/133885 [01:08<00:00, 1951.58it/s]
Traceback (most recent call last):
  File "qm9/train_qm9.py", line 208, in <module>
    main()
  File "qm9/train_qm9.py", line 127, in main
    os.makedirs(cache_dir)
  File "/Users/oonokenta/.pyenv/versions/anaconda3-4.3.0/envs/anaconda3/lib/python3.6/os.py", line 220, in makedirs
    mkdir(name, mode)
FileExistsError: [Errno 17] File exists: 'input/nfp_all'

Align the API of GraphConvPredictor in the Tox21 example to that of sklearn.

Currently, GraphConvPredictor.predict outputs an array of probabilities of being positive. But estimators in scikit-learn return labels from predict and use predict_proba for probabilities. To make the behavior intuitive for scikit-learn users, we should align the API with scikit-learn, either by renaming the method to predict_proba or by changing the returned value to labels.
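The scikit-learn convention proposed here can be sketched as follows; the classifier is a hypothetical stand-in, not GraphConvPredictor:

```python
import numpy as np

class SklearnStyleClassifier:
    """Toy classifier following the scikit-learn naming convention:
    predict_proba -> probabilities, predict -> hard labels."""

    def predict_proba(self, x):
        # Stand-in "probability of being positive" per sample (sigmoid).
        return 1.0 / (1.0 + np.exp(-x))

    def predict(self, x, threshold=0.5):
        # Hard labels derived by thresholding the probabilities.
        return (self.predict_proba(x) >= threshold).astype(np.int32)

clf = SklearnStyleClassifier()
scores = np.array([-2.0, 0.0, 3.0])
print(clf.predict(scores))  # [0 1 1]
```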

segmentation fault in example

I have followed the commands introduced in the README in order to run the examples, but they failed with a segmentation fault.

Here is my environment:
chainer version: 3.5.0
OS: macOS High Sierra version 10.13.4
Anaconda version: 4.5.0

I created chainer-env via Anaconda with the commands below.

$ conda create -c rdkit -n chainer-env rdkit
$ conda activate chainer-env
$ pip install chainer
$ conda install scikit-learn pandas tqdm
$ git clone https://github.com/pfnet-research/chainer-chemistry.git
$ pip install -e chainer-chemistry

Next I tried to run the example introduced in the README, but unfortunately these commands ended with a segmentation fault.

$ cd chainer-chemistry/examples/tox21
$ python train_tox21.py --method=nfp  --gpu=0 
zsh: segmentation fault  python train_tox21.py --method=nfp --gpu=0

I investigated this case and found the following might be the cause of this problem.

$ python
>>> import chainer
>>> import rdkit # this works
$ python
>>> import rdkit
>>> import chainer
segmentation fault

That is, importing rdkit after importing chainer works, but the opposite order does not.

Applying weavenet model

Hello,
Prediction with WeaveNet behaves strangely: my dataset has 1028 rows, but when I run prediction, the output array has only 437 rows. When I predict on a reduced dataset, the output shrinks even further. Other models return predictions of the same length with the same code.

What could be the problem?

RSGCN TODO list

Remaining tasks after #89

[Required]

  • Add test code (both pytest and example.sh), including a test of the tox21 predict code
  • Add the network and its paper citation to the README

[Nice to have]

  • Residual connection option:
    The original paper experiments with and without residual connections in the appendix,
    so a keyword option could be added to the network to enable residual connections.
  • Add the network to the QM9 example as well.
  • Support sparse matmul.
    They claim calculation can be faster with a sparse matrix.
    https://github.com/tkipf/pygcn/blob/master/pygcn/layers.py#L9
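To illustrate that claim: a CSR-style matrix-vector product touches only the stored nonzeros instead of all n² entries, which pays off because adjacency matrices of molecules are mostly zeros. A hand-rolled sketch for illustration only; in practice one would use a sparse library rather than this code:

```python
import numpy as np

def to_csr(dense):
    """Convert a dense matrix to CSR-style (data, indices, indptr) arrays."""
    data, indices, indptr = [], [], [0]
    for row in dense:
        for j, v in enumerate(row):
            if v != 0:
                data.append(v)
                indices.append(j)
        indptr.append(len(data))
    return np.array(data), np.array(indices), np.array(indptr)

def csr_matvec(data, indices, indptr, x):
    """y = A @ x using only the stored nonzeros of A."""
    y = np.zeros(len(indptr) - 1)
    for i in range(len(y)):
        lo, hi = indptr[i], indptr[i + 1]
        y[i] = np.dot(data[lo:hi], x[indices[lo:hi]])
    return y

adj = np.array([[0., 1., 0.],
                [1., 0., 1.],
                [0., 1., 0.]])
x = np.array([1., 2., 3.])
data, indices, indptr = to_csr(adj)
print(np.allclose(csr_matvec(data, indices, indptr, x), adj @ x))  # True
```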
