
Neural#DNF

This is the code repository for the neural model counting system Neural#DNF, presented in the AAAI-20 paper "Learning to Reason: Leveraging Neural Networks for Approximate DNF Counting". It contains code for building and training the graph neural network, generating random DNF formulas with the desired properties, and evaluating the network as described in the paper. It also contains the final trained weights in the netParams_2 subdirectory.

Requirements

  • TensorFlow >= 1.12 and the corresponding NumPy version
  • joblib >= 0.12.0 (for data generation)
  • SciPy
  • Matplotlib (for figure reproduction)
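
The dependencies can be installed with pip; the following is a suggested invocation based on the list above (the version pins are indicative only, and TensorFlow 1.x wheels are only published for older Python versions):

pip install "tensorflow>=1.12,<2" "joblib>=0.12.0" scipy matplotlib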

Datasets

The paper's training and test data sets are available for download in ZIP format here.

Running Neural#DNF

Overall Experiments

To run overall experiments on a test dataset with thresholds 0.01, 0.02, 0.05, 0.1, 0.15, and 0.2, run

python runExperiments.py "../Data/TestSetDirectory/"

Additional options include reporting results broken down by clause width, using -widthBasedAnalysis T, and changing the number of message-passing iterations, using -numIter X. A comprehensive list of options can be viewed using the command

python runExperiments.py -h
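
For example, a hypothetical run that reports results by clause width and uses 30 message-passing iterations (the iteration count here is purely illustrative) would be

python runExperiments.py "../Data/TestSetDirectory/" -widthBasedAnalysis T -numIter 30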

Experiments vs Formula Number of Variables

This test setting runs analogously to the previous one, but instead reports performance against the number of variables n in the formula. It can be run (additional options: -h) using the command:

python runExperimentsBySize.py "../Data/TestSetDirectory/"

Generating labelled DNF formulas

To generate data according to the paper's standard proportions, run

python generateData.py

Additional options for generation are also provided; a list of these can be found using the -h flag. Note: the generation proportions can be changed by altering the three proportions arrays in the generateData.py script: dFS (distinct file sizes), nCD (number-of-clauses distribution), and cWD (clause-width distribution). A hypothetical sketch of these arrays is shown below.
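
As a rough illustration only: the array names dFS, nCD, and cWD come from the note above, but the lengths and values below are hypothetical and do not reflect the actual contents of generateData.py. Each array describes a probability distribution over its respective buckets and should sum to 1.

# Hypothetical sketch of the generation-proportion arrays in generateData.py;
# replace the illustrative values with the proportions you want.
dFS = [0.25, 0.25, 0.25, 0.25]   # distinct file sizes: share of formulas per size bucket
nCD = [0.2, 0.3, 0.3, 0.2]       # number-of-clauses distribution
cWD = [0.1, 0.4, 0.4, 0.1]       # clause-width distribution

# Sanity check: each array should sum to 1 (up to floating-point error).
assert all(abs(sum(a) - 1.0) < 1e-9 for a in (dFS, nCD, cWD))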

Training Neural#DNF

A single epoch of training on a training set can be run using the command

python Train.py "../Data/TrainingSet/"

The number of epochs can be changed using the -nbTrainingEpochs X flag, and further options can be listed with -h.
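
For example, a hypothetical run that trains for 10 epochs (the epoch count is illustrative) would be

python Train.py "../Data/TrainingSet/" -nbTrainingEpochs 10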

Referencing this paper

If you make use of this code or its accompanying paper, please cite this work as follows:

@inproceedings{ACL-AAAI20,
  title     = {Learning to {R}eason: Leveraging Neural Networks for Approximate {DNF} Counting},
  author    = {Ralph Abboud and
               {\.I}smail {\.I}lkan Ceylan and
               Thomas Lukasiewicz},
  booktitle = {Proceedings of the Thirty-Fourth {AAAI} Conference on Artificial Intelligence ({AAAI})},
  year      = {2020}
}
