dit

dit is a Python package for information theory.

Try dit live: run `dit` interactively in your browser.

Introduction

Information theory is a powerful extension to probability and statistics, quantifying dependencies among arbitrary random variables in a way that is consistent and comparable across systems and scales. Information theory was originally developed to quantify how quickly and reliably information could be transmitted across an arbitrary channel. The demands of modern, data-driven science have since co-opted and extended these quantities and methods to multivariate settings where interpretations and best practices are not yet established. For example, there are at least four reasonable multivariate generalizations of the mutual information, none of which inherit all the interpretations of the standard bivariate case, and which is best to use is context-dependent. dit implements a vast range of multivariate information measures so that practitioners can study how these measures behave and interact in a variety of contexts. We hope that having all of these measures and techniques implemented in one place will enable the development of robust techniques for the automated quantification of dependencies within a system, and a concrete interpretation of what those dependencies mean.

Citing

If you use dit in your research, please cite it as:

@article{dit,
  Author = {James, R. G. and Ellison, C. J. and Crutchfield, J. P.},
  Title = {{dit}: a {P}ython package for discrete information theory},
  Journal = {The Journal of Open Source Software},
  Volume = {3},
  Number = {25},
  Pages = {738},
  Year = {2018},
  Doi = {10.21105/joss.00738}
}

Basic Information

Documentation

http://docs.dit.io

Downloads

https://pypi.org/project/dit/

https://anaconda.org/conda-forge/dit

Dependencies

Optional Dependencies

  • colorama: colored column headers in PID output, indicating failure modes
  • cython: faster sampling from distributions
  • hypothesis: random sampling of distributions
  • matplotlib, python-ternary: plotting of various information-theoretic expansions
  • numdifftools: numerical evaluation of gradients and Hessians during optimization
  • pint: adds units to informational values
  • scikit-learn: faster nearest-neighbor lookups during entropy/mutual information estimation from samples

Install

The easiest way to install is:

pip install dit

If you want to install dit within a conda environment, you can simply do:

conda install -c conda-forge dit

Alternatively, you can clone this repository, move into the newly created dit directory, and then install the package:

git clone https://github.com/dit/dit.git
cd dit
pip install .

Note

The Cython extensions are currently not supported on Windows. Please install using the --nocython option.

Testing

$ git clone https://github.com/dit/dit.git
$ cd dit
$ pip install -r requirements_testing.txt
$ pytest

Code and bug tracker

https://github.com/dit/dit

License

BSD 3-Clause, see LICENSE.txt for details.

Implemented Measures

dit implements the following information measures. Most of these are implemented in multivariate and conditional generality, where such generalizations either exist in the literature or are relatively obvious; for example, though it does not appear in the literature, the multivariate conditional exact common information is implemented here (see the sketch below).
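
As a taste of that generality, here is a minimal sketch of the conditional exact common information on the xor distribution; the (distribution, variables, conditioned variables) signature is an assumption based on dit's usual conventions:

>>> import dit
>>> import dit.example_dists
>>> from dit.multivariate import exact_common_information
>>> d = dit.example_dists.Xor()
>>> # G[X, Y | Z]: given Z, X and Y are perfectly correlated,
>>> # so the optimization should return approximately one bit.
>>> exact_common_information(d, [[0], [1]], [2])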

Entropies

  • Shannon Entropy
  • Rényi Entropy
  • Tsallis Entropy
  • Necessary Conditional Entropy
  • Residual Entropy / Independent Information / Variation of Information
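
For example, the Rényi and Tsallis entropies generalize the Shannon entropy with an order parameter that reweights common versus rare outcomes. A minimal sketch using the thick coin from the Quickstart below, assuming the (distribution, order) signatures in dit.other:

>>> import dit
>>> from dit.other import renyi_entropy, tsallis_entropy
>>> d = dit.Distribution(['H', 'T', 'E'], [0.4, 0.4, 0.2])
>>> dit.shannon.entropy(d)  # Shannon entropy, about 1.522 bits
>>> renyi_entropy(d, 2)     # collision entropy, -log2(0.36), about 1.474 bits
>>> tsallis_entropy(d, 2)   # (1 - 0.36) / (2 - 1) = 0.64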

Mutual Informations

  • Co-Information
  • Interaction Information
  • Total Correlation / Multi-Information
  • Dual Total Correlation / Binding Information
  • CAEKL Multivariate Mutual Information
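
These generalizations genuinely disagree: on the xor distribution from the Quickstart below, the co-information is negative while the total correlations are positive. A minimal sketch, assuming the function names in dit.multivariate:

>>> import dit
>>> import dit.example_dists
>>> from dit.multivariate import coinformation, total_correlation, dual_total_correlation
>>> d = dit.example_dists.Xor()
>>> coinformation(d)           # -1.0: pairwise independence, but joint synergy
>>> total_correlation(d)       # 1.0: sum of marginal entropies (3) minus joint entropy (2)
>>> dual_total_correlation(d)  # 2.0: joint entropy (2) minus residual entropy (0)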

Divergences

  • Variational Distance
  • Kullback-Leibler Divergence / Relative Entropy
  • Cross Entropy
  • Jensen-Shannon Divergence
  • Earth Mover's Distance
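
A minimal sketch comparing two of these on a pair of binary distributions, assuming the function names in dit.divergences:

>>> import dit
>>> from dit.divergences import kullback_leibler_divergence, jensen_shannon_divergence
>>> p = dit.Distribution(['0', '1'], [1/2, 1/2])
>>> q = dit.Distribution(['0', '1'], [3/4, 1/4])
>>> kullback_leibler_divergence(p, q)  # asymmetric, about 0.2075 bits
>>> jensen_shannon_divergence([p, q])  # symmetric, about 0.0488 bits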

Other Measures

  • Channel Capacity
  • Complexity Profile
  • Connected Informations
  • Copy Mutual Information
  • Cumulative Residual Entropy
  • Extropy
  • Hypercontractivity Coefficient
  • Information Bottleneck
  • Information Diagrams
  • Information Trimming
  • Lautum Information
  • LMPR Complexity
  • Marginal Utility of Information
  • Maximum Correlation
  • Maximum Entropy Distributions
  • Perplexity
  • Rate-Distortion Theory
  • TSE Complexity
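
Many of these are one-liners on a distribution. For instance, the perplexity is the effective number of equally likely outcomes; a sketch with the thick coin from the Quickstart below, assuming dit.other.perplexity takes a distribution:

>>> import dit
>>> d = dit.Distribution(['H', 'T', 'E'], [0.4, 0.4, 0.2])
>>> dit.other.perplexity(d)  # 2**H(d), about 2.87 effective outcomes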

Common Informations

  • Gács-Körner Common Information
  • Wyner Common Information
  • Exact Common Information
  • Functional Common Information
  • MSS Common Information
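
Common informations quantify randomness that the variables can be said to share outright. A minimal sketch of the Gács-Körner variant, assuming dit.multivariate.gk_common_information takes a distribution:

>>> import dit
>>> import dit.example_dists
>>> from dit.multivariate import gk_common_information
>>> d = dit.Distribution(['00', '11'], [1/2, 1/2])  # two perfectly correlated bits
>>> gk_common_information(d)                        # 1.0: one full bit is shared
>>> gk_common_information(dit.example_dists.Xor())  # 0.0: no extractable common variable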

Partial Information Decomposition

  • I_{min}
  • I_{\wedge}
  • I_{RR}
  • I_{\downarrow}
  • I_{proj}
  • I_{BROJA}
  • I_{ccs}
  • I_{\pm}
  • I_{dep}
  • I_{RAV}
  • I_{mmi}
  • I_{\prec}
  • I_{RA}
  • I_{SKAR}
  • I_{IG}
  • I_{RDR}
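
Each of these is a candidate redundancy measure inducing a decomposition of the inputs' information about an output. A minimal sketch with the BROJA measure, assuming the dit.pid.PID_BROJA entry point:

>>> import dit
>>> import dit.example_dists
>>> from dit.pid import PID_BROJA
>>> print(PID_BROJA(dit.example_dists.Xor()))  # xor should decompose as pure synergy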

Secret Key Agreement Bounds

  • Secrecy Capacity
  • Intrinsic Mutual Information
  • Reduced Intrinsic Mutual Information
  • Minimal Intrinsic Mutual Information
  • Necessary Intrinsic Mutual Information
  • Two-Part Intrinsic Mutual Information

Quickstart

The basic usage of dit corresponds to creating distributions, modifying them if need be, and then computing properties of those distributions. First, we import:

>>> import dit

Suppose we have a really thick coin, one so thick that there is a reasonable chance of it landing on its edge. Here is how we might represent the coin in dit.

>>> d = dit.Distribution(['H', 'T', 'E'], [.4, .4, .2])
>>> print(d)
Class:          Distribution
Alphabet:       ('E', 'H', 'T') for all rvs
Base:           linear
Outcome Class:  str
Outcome Length: 1
RV Names:       None

x   p(x)
E   0.2
H   0.4
T   0.4

Calculate the probability of H and also of the combination H or T.

>>> d['H']
0.4
>>> d.event_probability(['H','T'])
0.8

Calculate the Shannon entropy and extropy of the joint distribution.

>>> dit.shannon.entropy(d)
1.5219280948873621
>>> dit.other.extropy(d)
1.1419011889093373

Create a distribution where Z = xor(X, Y).

>>> import dit.example_dists
>>> d = dit.example_dists.Xor()
>>> d.set_rv_names(['X', 'Y', 'Z'])
>>> print(d)
Class:          Distribution
Alphabet:       ('0', '1') for all rvs
Base:           linear
Outcome Class:  str
Outcome Length: 3
RV Names:       ('X', 'Y', 'Z')

x     p(x)
000   0.25
011   0.25
101   0.25
110   0.25

Calculate the Shannon mutual informations I[X:Z], I[Y:Z], and I[X,Y:Z].

>>> dit.shannon.mutual_information(d, ['X'], ['Z'])
0.0
>>> dit.shannon.mutual_information(d, ['Y'], ['Z'])
0.0
>>> dit.shannon.mutual_information(d, ['X', 'Y'], ['Z'])
1.0

Calculate the marginal distribution P(X,Z). Then print its probabilities as fractions, showing the mask.

>>> d2 = d.marginal(['X', 'Z'])
>>> print(d2.to_string(show_mask=True, exact=True))
Class:          Distribution
Alphabet:       ('0', '1') for all rvs
Base:           linear
Outcome Class:  str
Outcome Length: 2 (mask: 3)
RV Names:       ('X', 'Z')

x     p(x)
0*0   1/4
0*1   1/4
1*0   1/4
1*1   1/4

Convert the distribution probabilities to log (base 3.5) probabilities, and access its probability mass function.

>>> d2.set_base(3.5)
>>> d2.pmf
array([-1.10658951, -1.10658951, -1.10658951, -1.10658951])

Draw 5 random samples from this distribution.

>>> dit.math.prng.seed(1)
>>> d2.rand(5)
['01', '10', '00', '01', '00']

Contributions & Help

If you'd like a feature added to dit, please file an issue. Or, better yet, open a pull request. Ideally, all code should be tested and documented, but please don't let this be a barrier to contributing. We'll work with you to ensure that all pull requests are in a mergeable state.

If you'd like to get in contact about anything, you can reach us through our Slack channel.
