jlizier / jidt

JIDT: Java Information Dynamics Toolkit for studying information-theoretic measures of computation in complex systems

Home Page: http://jlizier.github.io/jidt/

License: GNU General Public License v3.0

Java 67.52% MATLAB 0.11% Makefile 0.26% Shell 0.01% C 0.85% Cuda 30.13% C++ 1.12%
information-theory transfer-entropy entropy mutual-information conditional-mutual-information conditional-transfer-entropy java python matlab octave

jidt's Introduction

Java Information Dynamics Toolkit (JIDT)

Copyright (C) 2012- Joseph T. Lizier; 2014- Ipek Özdemir; 2017- Pedro Mediano; 2019- Emanuele Crosato, Sooraj Sekhar, Oscar Huaigu Xu; 2022- David Shorten

JIDT provides a stand-alone, open-source Java implementation (also usable in Matlab, Octave, Python, R, Julia and Clojure) of information-theoretic measures of distributed computation in complex systems: i.e. information storage, transfer and modification.

JIDT includes implementations:

  • principally for the measures transfer entropy, mutual information, and their conditional variants, as well as active information storage, entropy, etc;
  • for both discrete and continuous-valued data;
  • using various types of estimators (e.g. Kraskov-Stögbauer-Grassberger estimators, box-kernel estimation, linear-Gaussian), as described in full at ImplementedMeasures.
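As a flavour of the API, here is a minimal sketch of computing transfer entropy with a KSG estimator, modelled on the Simple Java Demos shipped with the distribution; the toy coupled time series below are purely illustrative and not part of the toolkit:

```java
import infodynamics.measures.continuous.kraskov.TransferEntropyCalculatorKraskov;

public class MinimalTeDemo {
    public static void main(String[] args) throws Exception {
        // Toy data: dest is weakly driven by the previous value of source
        java.util.Random rng = new java.util.Random(0);
        double[] source = new double[1000];
        double[] dest = new double[1000];
        for (int t = 1; t < source.length; t++) {
            source[t] = rng.nextGaussian();
            dest[t] = 0.5 * source[t - 1] + rng.nextGaussian();
        }
        // KSG (Kraskov-Stögbauer-Grassberger) transfer entropy estimator
        TransferEntropyCalculatorKraskov teCalc = new TransferEntropyCalculatorKraskov();
        teCalc.setProperty("k", "4"); // 4 nearest neighbours in the KSG search
        teCalc.initialise(1);         // destination history length of 1
        teCalc.setObservations(source, dest);
        System.out.printf("TE(source -> dest) = %.4f nats%n",
                teCalc.computeAverageLocalOfObservations());
    }
}
```

The same calculator objects can be driven from Matlab/Octave, Python, R, Julia or Clojure by loading infodynamics.jar into the relevant Java bridge, as the demos show.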

JIDT is easy to use:

  • It ships with a GUI application -- the AutoAnalyser, see picture below -- to facilitate point-and-click analysis, as well as code template generation for more complex analysis.
  • We provide short video lectures and corresponding slides in a (beta) Course on how to use information-theoretic tools to analyse complex systems, and how to implement such analysis with JIDT.

JIDT is distributed under the GNU GPL v3 license (or later).

Getting started

  1. Download and installation are very easy!
    1. Quick start: take a git clone (then build via AntScripts) OR download the latest v1.6.1 full distribution (suitable for all platforms) and see the readme.txt file therein.
  2. Documentation including: the paper describing JIDT at arXiv:1408.3270 (distributed with the toolkit), a (beta) Course including short video lectures and a shorter Tutorial, and Javadocs (for v1.6.1);
  3. Demos are included with the full distribution, including a GUI app for automatic analysis and code generation (see picture below), simple java demos and cellular automata (CA) demos.
    1. These Java tools can easily be used in Matlab/Octave, Python, R, Julia and Clojure! (see the wiki examples for each language)

[Image: Computing in the GUI app]

Course and video lectures

For further information or announcements, see the project home page.

Citation

Please cite your use of this toolkit as:

Joseph T. Lizier, "JIDT: An information-theoretic toolkit for studying the dynamics of complex systems", Frontiers in Robotics and AI 1:11, 2014; doi:10.3389/frobt.2014.00011 (pre-print: arXiv:1408.3270)

And please let me know about any publications resulting from its use!

See other PublicationsUsingThisToolkit.

News

22/08/2023 - New full distribution files available for release v1.6.1; Changes for v1.6.1 include: Minor updates to support use in Python, including virtual environments; Minor tweaks to fish schooling examples (mostly comments).

5/09/2022 - New full distribution files available for release v1.6; Changes for v1.6 include: Adding Flocking/Schooling/Swarming demo; Included Pedro's code on IIT and O-/S-Information measures; Spiking TE estimator added from David; Fixed up AutoAnalyser to work well for Python3 and numpy; Links to lecture videos included in the beta wiki for the course; Added rudimentary effective network inference (simplified version of the IDTxl full algorithm) in demos/octave/EffectiveNetworkInference;

26/11/2018 - New jar and full distribution files available for release v1.5; Changes for v1.5 include: Added GPU (cuda) capability for KSG Conditional Mutual Information calculator (proper documentation to come), brief wiki page and unit tests included; Added auto-embedding for TE/AIS with multivariate KSG, and univariate and multivariate Gaussian estimator (plus unit tests), for Ragwitz criteria and Maximum bias-corrected AIS, and also added Maximum bias corrected AIS and TE to handle source embedding as well; Kozachenko entropy estimator adds noise to data by default; Added bias-correction property to Gaussian and Kernel estimators for MI and conditional MI, including with surrogates (only option for kernel); Enabled use of different bases for different variables in MI discrete estimator; All new above features enabled in AutoAnalyser; Added drop-down menus for parameters in AutoAnalyser; Included long-form lecture slides in course folder;

26/11/2017 - New jar and full distribution files available for release v1.4; Changes for v1.4 include: Major expansion of functionality for AutoAnalysers: adding Launcher applet and capability to double click jar to launch, added Entropy, CMI, CTE and AIS AutoAnalysers, also added binned estimator type, added all variables/pairs analysis, added statistical significance analysis, and ensured functionality of generated Python code with Python3; Added GPU (cuda) capability for KSG Mutual Information calculator (proper documentation and wiki page to come), including unit tests; Added fast neighbour search implementations for mixed discrete-continuous KSG MI estimators; Expanded Gaussian estimator for multi-information (integration); Made all demo/data files readable by Matlab.

17/12/2016 - New book out from J. Lizier et al., "An Introduction to Transfer Entropy: Information Flow in Complex Systems" published by Springer, which contains various examples using JIDT (distributed in our releases)

21/10/2016 - New jar and full distribution files available for release v1.3.1; Changes for v1.3.1 include: Major update to TransferEntropyCalculatorDiscrete so as to implement arbitrary source and dest embeddings and source-dest delay; Conditional TE calculators (continuous) handle empty conditional variables; Added new auto-embedding method for AIS and TE which maximises bias corrected AIS; Added getNumSeparateObservations() method to TE calculators to make reconstructing/separating local values easier after multiple addObservations() calls; Fixed kernel estimator classes to return proper densities, not probabilities; Bug fix in mixed discrete-continuous MI (Kraskov) implementation; Added simple interface for adding joint observations for MultiInfoCalculatorDiscrete; Included compiled class files for the AutoAnalyser demo in distribution; Updated Python demo 1 to show use of numpy arrays with ints; Added Python demo 7 and 9 for TE Kraskov with ensemble method and auto-embedding respectively; Added Matlab/Octave example 10 for conditional TE via Kraskov (KSG) algorithm; Added utilities to prepare for enhancing surrogate calculations with fast nearest neighbour search; Minor bug patch to Python readFloatsFile utility.

19/7/2015 - New jar and full distribution files available for release v1.3; Changes for v1.3 include: Added AutoAnalyser (Code Generator) GUI demo for MI and TE; Added auto-embedding capability via Ragwitz criteria for AIS and TE calculators (KSG estimators); Added Java demo 9 for showcasing use of Ragwitz auto-embedding; Adding small amount of noise to data in all KSG estimators now by default (may be disabled via setProperty()); Added getProperty() methods for all conditional MI and TE calculators; Upgraded Python demos for Python 3 compatibility; Fixed bias correction on mixed discrete-continuous KSG calculators; Updated the tutorial slides to those in use for ECAL 2015 JIDT tutorial.

12/2/2015 - New jar and full distribution files available for release v1.2.1; Changes for v1.2.1 include: Added tutorial slides, description of exercises and sample exercise solutions; Made jar target Java 1.6; Added Schreiber TE heart-breath rate with KSG estimator demo code for Python.

28/1/2015 - New jar and full distribution files available for release v1.2; Changes for v1.2 include: Dynamic correlation exclusion, or Theiler window, added to all Kraskov estimators; Added univariate MI calculation to simple demo 6; Added Java code for Schreiber TE heart-breath rate with KSG estimator, ready for use as a template in Tutorial; Patch for crashes in KSG conditional MI algorithm 2.

20/11/2014 - New jar and full distribution files available for release v1.1; Changes for v1.1 include: Implemented Fast Nearest Neighbour Search for Kraskov-Stögbauer-Grassberger (KSG) estimators for MI, conditional MI, TE, conditional TE, AIS, Predictive info, and multi-information. This includes a general (multivariate) k-d tree implementation; Added multi-threading (using all available processors by default) for the KSG estimators -- code contributed by Ipek Özdemir; Added Predictive information / Excess entropy implementations for KSG, kernel and Gaussian estimators; Added R, Julia, and Clojure demos; Added Windows batch files for the Simple Java Demos; Added property for adding a small amount of noise to data in all KSG estimators;

15/8/2014 - JIDT paper finalised and uploaded to the website and arXiv:1408.3270

14/8/2014 - New jar and full distribution files available for our first official release, v1.0; Changes for v1.0 include: Added the draft of the paper on the toolkit to the release; Javadocs made ready for release; Switched source->destination arguments for discrete TE calculators to be with source first in line with continuous calculators; Renamed all discrete calculators to have Discrete suffix -- TE and conditional TE calculators also renamed to remove "Apparent" prefix and change "Complete" to "Conditional"; Kraskov estimators now using 4 nearest neighbours by default; Unit test for Gaussian TE against ChaLearn Granger causality measurement; Added Schreiber TE demos; Interregional transfer demos; documentation for Interaction lag demos; added examples 7 and 8 to Simple Java demos; Added property to add noise to data for Kraskov MI; Added derivation of Apache Commons Math code for chi square distribution, and included relevant notices in our release; Inserted translation class for arrays between Octave and Java; Added analytic statistical significance calculation to Gaussian calculators and discrete TE; Corrected Kraskov algorithm 2 for conditional MI to follow equation in Wibral et al. 2014.

20/4/2014 - New jar and full distribution files available for v0.2.0; Moved downloads to http://lizier.me/joseph/ since google code has stopped the download facility here :(. Changes for v0.2.0 include: Rearchitected (most) Transfer Entropy and Multivariate TE calculators to use an underlying conditional mutual information calculator, and have arbitrary embedding delay, source-dest delay; this includes moving Kraskov-Grassberger Transfer Entropy calculator to use a single conditional mutual information estimator instead of two mutual information estimators; Rearchitected (most) Active Information Storage calculators to use an underlying mutual information calculator; Added Conditional Transfer Entropy calculators using underlying conditional mutual information calculators; Moved mixed discrete-continuous calculators to a new "mixed" package; bug fixes.

11/9/2013 - New jar and full distribution files available for v0.1.4; added scripts to generate CA figures for 2013 book chapters; added general Java demo code; added Python demo code; made Octave/Matlab demos and CA demos properly compatible for Matlab; added extra Octave/Matlab general demos; added more unit tests for MI and conditional MI calculators, including against results from Wibral's TRENTOOL; bug fixes.

11/9/2013 - New CA demo scripts for several review book chapters we're preparing in 2013 have been uploaded - see CellularAutomataDemos.

4/6/2013 - Added instructions on how to use JIDT in Python and several PythonExamples.

13/01/2013 - New jar and full distribution files available for v0.1.3; existing Octave/Matlab demo code made compatible with Matlab; several bug fixes, including using max norm by default in Kraskov calculator (instead of requiring this to be set explicitly); more unit tests (including against results from Kraskov's own MI implementation)

19/11/2012 - New jar and full distribution files available for v0.1.2, including demo code for two newly submitted papers

31/10/2012 - Jar and full distribution files available for v0.1.1 (first distribution)

7/5/2012 - JIDT project created and code uploaded

Acknowledgements

This project has been supported by funding through:

  • Australian Research Council Discovery Early Career Researcher Award (DECRA) "Relating function of complex networks to structure using information theory", J.T. Lizier, 2016-19 DE160100630
  • Universities Australia - Deutscher Akademischer Austauschdienst (German Academic Exchange Service) UA-DAAD Australia-Germany Joint Research Co-operation grant "Measuring neural information synthesis and its impairment", Wibral, Lizier, Priesemann, Wollstadt, Finn, 2016-17
  • University of Sydney Research Accelerator (SOAR) Fellowship 2019 Scheme, J.T. Lizier (CI), 2019-2020
  • Australian Research Council Discovery Project "Large-scale computational modelling of epidemics in Australia: analysis, prediction and mitigation", M. Prokopenko, P. Pattison, M. Gambhir, J.T. Lizier, M. Piraveenan, 2016-19 DP160102742

jidt's People

Contributors

dpshorten, jlizier, pmediano


jidt's Issues

Alter MutualInfoCalculatorMultiVariate significance calculations to specify which variable should be reordered

The ChannelCalculators talk about reordering the source variable to evaluate 
the statistical significance, but the 
MutualInfoCalculatorMultiVariate.computeAverageLocalOfObservations(int[] 
newOrdering) method reorders the second variable (which is now the 
destination). This is because it used to have source variable as the second 
argument.

Should reorganise this code so that it follows a consistent interpretation 
(reordering the source variable - the first variable). This will necessitate 
shuffling the arguments around in the Kraskov calculators. They are currently 
consistent (MI gives right result, TE calculator uses the variables in the 
correct order) but I want this finalised.
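For context, a rough sketch of the surrogate computation this issue concerns, using the reordering method as it is named in the issue above (the exact signature, and which variable ends up shuffled, is precisely what this issue wants settled):

```java
import infodynamics.measures.continuous.kraskov.MutualInfoCalculatorMultiVariateKraskov1;
import java.util.Random;

public class MiReorderingSketch {
    public static void main(String[] args) throws Exception {
        Random rng = new Random(0);
        int n = 1000;
        double[][] varA = new double[n][1], varB = new double[n][1];
        for (int t = 0; t < n; t++) {
            varA[t][0] = rng.nextGaussian();
            varB[t][0] = 0.7 * varA[t][0] + rng.nextGaussian(); // correlated pair
        }
        MutualInfoCalculatorMultiVariateKraskov1 miCalc =
                new MutualInfoCalculatorMultiVariateKraskov1();
        miCalc.initialise(1, 1);               // univariate on each side
        miCalc.setObservations(varA, varB);
        double actualMi = miCalc.computeAverageLocalOfObservations();

        // One surrogate: a random permutation of the time indices, applied (per the
        // current behaviour described above) to the *second* variable.
        int[] newOrdering = new int[n];
        for (int t = 0; t < n; t++) newOrdering[t] = t;
        for (int t = n - 1; t > 0; t--) {       // Fisher-Yates shuffle
            int j = rng.nextInt(t + 1);
            int tmp = newOrdering[t]; newOrdering[t] = newOrdering[j]; newOrdering[j] = tmp;
        }
        double surrogateMi = miCalc.computeAverageLocalOfObservations(newOrdering);
        System.out.printf("MI = %.4f nats, surrogate MI = %.4f nats%n", actualMi, surrogateMi);
    }
}
```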



Original issue reported on code.google.com by joseph.lizier on 6 Aug 2012 at 6:38

Add bias correction for local MI/TE values

Add these as an option that can be turned on in properties.

See description by Turney and Pantel (2010), which is cited in our 2014 
Frontiers paper on Local Active Info Storage


Original issue reported on code.google.com by joseph.lizier on 28 Jan 2014 at 5:02

Add NetLogo example

Add an example which calls the jar from NetLogo (I think this is possible)

Original issue reported on code.google.com by joseph.lizier on 16 Apr 2014 at 12:54

Implement GPU code for Kraskov nearest neighbour search

Some tips on using aparapi for java here (suggested by Michael Wibral):

http://blogs.amd.com/developer/2011/09/14/i-dont-always-write-gpu-code-in-java-but-when-i-do-i-like-to-use-aparapi/

Original issue reported on code.google.com by joseph.lizier on 7 Apr 2013 at 11:08

Switch kernel TE estimator to compute via a conditional MI

This will require constructing a kernel conditional MI calculator -- this 
should largely be done from the multivariate TE kernel.
This will allow other parameters to be set, e.g. source history length.

Original issue reported on code.google.com by joseph.lizier on 29 Sep 2014 at 5:18

Replace org.octave.Matrix calls with our own class

Conversion between native octave types and java types is not always working 
correctly in octaveToJavaDoubleArray, octaveToJavaDoubleMatrix, 
octaveToJavaIntArray and octaveToJavaIntMatrix, as identified by Christoph 
Hartmann:

"The bug seems to be in the octaveToJavaDoubleMatrix.m. Whenever the Java 
Matrix is created there with javaObject('org.octave.Matrix',...), it sometimes 
returns an int matrix and sometimes a double matrix. Whenever an int-matrix is 
returned, the .asDoubleMatrix() returns null, which leads to the NullPointer 
Exception. I suspect that this happens because the int-constructor is called 
instead of the double-constructor [2], but different forms of casting the 
values to double or the dimensions to int did not help."

The issue appears to be that octave is somehow not identifying the correct 
method signature in org.octave.Matrix. This bug is repeatable in 
octave/example6, which should load an array of doubles from a file, but somehow 
these are read into octave as integers. It is unclear why.

Solution appears to be to make the typing explicit in the method signatures, rather than letting octave decide it for us.
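A hypothetical illustration of that direction (the class and method names below are invented, not JIDT's actual translation class): with explicitly typed constructor parameters, Octave cannot silently dispatch double data to an int-based constructor.

```java
// Hypothetical sketch only: explicit typing in the constructor signature means
// javaObject(...) from Octave can never select an int-based constructor for double data.
public class ExplicitDoubleMatrix {
    private final double[][] data;

    public ExplicitDoubleMatrix(double[] flatColumnMajor, int rows, int cols) {
        data = new double[rows][cols];
        for (int c = 0; c < cols; c++)
            for (int r = 0; r < rows; r++)
                data[r][c] = flatColumnMajor[c * rows + r]; // Octave stores column-major
    }

    public double[][] asDoubleMatrix() {
        return data; // always non-null, unlike the failure mode described above
    }

    public static void main(String[] args) {
        double[] flat = {1.0, 2.0, 3.0, 4.0, 5.0, 6.0}; // 2x3 matrix in column-major order
        double[][] m = new ExplicitDoubleMatrix(flat, 2, 3).asDoubleMatrix();
        System.out.println(java.util.Arrays.deepToString(m)); // [[1.0, 3.0, 5.0], [2.0, 4.0, 6.0]]
    }
}
```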

Original issue reported on code.google.com by joseph.lizier on 29 Apr 2014 at 1:32

KSG TE calculator should work as MI for history k = 0

What steps will reproduce the problem?
1. Set k_HISTORY property to 0.
2. Compute

This should function but instead crashes.
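A minimal sketch of those reproduction steps (toy data for illustration only):

```java
import infodynamics.measures.continuous.kraskov.TransferEntropyCalculatorKraskov;

public class KHistoryZeroRepro {
    public static void main(String[] args) throws Exception {
        double[] source = new double[500], dest = new double[500];
        java.util.Random rng = new java.util.Random(0);
        for (int t = 1; t < source.length; t++) {
            source[t] = rng.nextGaussian();
            dest[t] = 0.5 * source[t - 1] + rng.nextGaussian();
        }
        TransferEntropyCalculatorKraskov teCalc = new TransferEntropyCalculatorKraskov();
        teCalc.setProperty("k_HISTORY", "0"); // step 1: no destination history
        teCalc.initialise();
        teCalc.setObservations(source, dest);
        // step 2: this should reduce to a source -> destination MI, but crashes instead
        double result = teCalc.computeAverageLocalOfObservations();
        System.out.println(result);
    }
}
```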

Can be fixed by implementing an underlying MI calculation if k = 0 is to be 
used. Could change this in the underlying conditional MI calculator in fact.

Original issue reported on code.google.com by joseph.lizier on 9 Jul 2015 at 12:40

Delete kraskov.TransferEntropyCalculatorKraskovByMulti

Can only do this once the multivariate TE classes have been replaced by one 
directly using a conditional TE calculator (since they depend on this class at 
the moment).

Original issue reported on code.google.com by joseph.lizier on 7 Apr 2014 at 4:12

Add a new demo with a GUI to take a data set and compute TE

Add a GUI -- possibly in python -- to take a data set, tell it which columns to compute TE between, select a calculator, supply parameters, and compute TE, or compute TE for a range of parameters and plot them.

Original issue reported on code.google.com by joseph.lizier on 12 Feb 2015 at 5:32

Add random noise to data in Kraskov calculators

Large numbers of identical data cause problems for the Kraskov estimator (since 
they violate its underlying assumptions).
Kraskov's toolkit MILCA adds in a small amount of random noise to address this.

We should add a property to do this to all Kraskov calculators.
This has been done for the Kraskov MI calculator in r305, but we still need to 
add it to conditional MI calculators and check that it works ok for TE and AIS.
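For reference, where this property exists (see the v1.1 and v1.3 news entries above), it can be set or disabled per calculator. A hedged sketch, assuming the NOISE_LEVEL_TO_ADD property name used in recent releases:

```java
import infodynamics.measures.continuous.kraskov.MutualInfoCalculatorMultiVariateKraskov1;

public class NoisePropertySketch {
    public static void main(String[] args) throws Exception {
        MutualInfoCalculatorMultiVariateKraskov1 miCalc =
                new MutualInfoCalculatorMultiVariateKraskov1();
        // Add a small amount of noise to break up identical data points
        // (the MILCA-style workaround described above):
        miCalc.setProperty("NOISE_LEVEL_TO_ADD", "1e-8");
        // Or, for exactly repeatable results, disable the noise addition:
        // miCalc.setProperty("NOISE_LEVEL_TO_ADD", "0");
        miCalc.initialise(1, 1);
        // ... setObservations(...) and compute as usual ...
    }
}
```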

Original issue reported on code.google.com by joseph.lizier on 30 Apr 2014 at 2:16

DetectingInteractionLags/transferWithSourceMemory.m is broken

Figure 2 plot -- the empirical results no longer match the analytic -- I seem 
to have broken something in the conditional TE discrete calculator.

Need to fix this before the v1.1 release.

Original issue reported on code.google.com by joseph.lizier on 20 Nov 2014 at 12:24

Kraskov calculators to ensure they use the actual k in the joint space rather than the assumed k

At the moment, our Kraskov calculators assume that they have exactly k points within epsilon in the full joint space. Where the distribution is composed of delta functions (i.e. somewhat discretised) this won't be the case - there could be many more than k. As such, it may be more correct to use the actual k for each observation in contributing digamma(k) - 1/k, then averaging, rather than assuming the parameter k.
This may add significantly to the runtime, however, so we could either:
- just issue a warning were this to occur, or
- track the actual k values in an array, then compute digamma(k) - 1/k once for each distinct value and add these in to the result (sketched below).
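For illustration, a rough sketch of the second option; the neighbour-counting helper here is hypothetical (only MathsUtils.digamma is an existing JIDT utility):

```java
import infodynamics.utils.MathsUtils;

// Hypothetical sketch of option 2: use the actual neighbour count for each
// observation instead of assuming the parameter k everywhere.
public class ActualKSketch {
    // Placeholder: in the real calculator this would count the points strictly
    // within the k-th nearest neighbour distance of observation t in the joint space.
    static int actualNeighbourCount(int t) { return 4; }

    public static void main(String[] args) throws Exception {
        int N = 1000;
        double sum = 0.0;
        for (int t = 0; t < N; t++) {
            int kActual = actualNeighbourCount(t); // may exceed the requested k for tied data
            sum += MathsUtils.digamma(kActual) - 1.0 / kActual;
        }
        double averageDigammaTerm = sum / N; // replaces the constant digamma(k) - 1/k
        System.out.println(averageDigammaTerm);
    }
}
```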

Original issue reported on code.google.com by joseph.lizier on 7 Sep 2012 at 5:47

Analytic Chi Square distributions have wrong "actual value"

The actualValue field is populated by the computed info theoretic value * 
observations, which is fine for a chiSquare distribution, but is not correct 
for TE or MI clients. (Should be just the computed info theoretic value)

Original issue reported on code.google.com by joseph.lizier on 28 Apr 2014 at 6:43

Add dynamic exclusion time / Theiler window to Kraskov calculators

This can easily be done for single-time series estimations, by changing the 
code which avoids the same time step to avoid multiple time steps around that.
Presumably we should actually alter the psi(N) here too I think, since we don't 
have a full set of N points going into every neighbourhood count then.

To implement this for the ensemble method would be tricky, and I don't think we 
should try this in the first pass.
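For reference, once this landed (the v1.2 news entry above notes the Theiler window was added to all Kraskov estimators), it is exposed as a calculator property. A hedged sketch, assuming the DYN_CORR_EXCL property name used in recent releases:

```java
import infodynamics.measures.continuous.kraskov.MutualInfoCalculatorMultiVariateKraskov1;

public class TheilerWindowSketch {
    public static void main(String[] args) throws Exception {
        MutualInfoCalculatorMultiVariateKraskov1 miCalc =
                new MutualInfoCalculatorMultiVariateKraskov1();
        // Exclude neighbours within +/- 10 time steps of each point (Theiler window),
        // so serially-correlated samples do not count as dynamically distinct neighbours:
        miCalc.setProperty("DYN_CORR_EXCL", "10");
        miCalc.initialise(1, 1);
        // ... setObservations(...) and compute as usual ...
    }
}
```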

Original issue reported on code.google.com by joseph.lizier on 31 Oct 2014 at 3:54

Kernel estimator to compute prob density function properly

The kernel estimator at the moment is effectively using a probability distribution function rather than a density function. This is because it is not correcting for the kernel width. The probability value returned should be divided by the kernel width so that it properly represents a density, and the entropies calculated will then be differential entropies.
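To make the proposed correction concrete, a hedged arithmetic sketch (the variable names are illustrative, not JIDT internals): for a max-norm box kernel of radius r in d dimensions, the counted fraction of points is a probability mass, and dividing by the kernel volume (2r)^d turns it into a density.

```java
// Illustrative arithmetic only (not JIDT's internal code): converting the
// box-kernel count into a proper probability density estimate.
public class KernelDensitySketch {
    public static void main(String[] args) {
        int countWithinKernel = 37;   // points found within the kernel around some sample
        int N = 1000;                 // total number of observations
        double kernelRadius = 0.25;   // box-kernel half-width r (max norm)
        int dimensions = 2;

        double probabilityMass = (double) countWithinKernel / N;          // what the estimator returns now
        double kernelVolume = Math.pow(2.0 * kernelRadius, dimensions);   // (2r)^d for a max-norm box
        double probabilityDensity = probabilityMass / kernelVolume;       // the proposed corrected value

        // Using the density in -log p gives differential entropy terms, as noted above.
        System.out.printf("mass = %.4f, density = %.4f%n", probabilityMass, probabilityDensity);
    }
}
```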

Add normalise option to MultiInfoCalculatorKraskov

The normalise option exists in MutualInfoCalculatorMultiVariateKraskov and 
others, but for some reason I didn't put it here.
It should be there for the Kraskov technique to work as documented.

Original issue reported on code.google.com by joseph.lizier on 20 Dec 2012 at 4:21

Think harder about the Exception handling design

Currently we just throw vanilla Exceptions for everything.
1. Should we have specific Exception types?
2. Should we think about whether just throwing them or doing something 
different is the best strategy?

Original issue reported on code.google.com by joseph.lizier on 17 Aug 2012 at 6:21

KSG calculators to add noise in by default

I had left noise addition out by default so that results were repeatable. But 
with a small value of noise this should still be ok. Adding noise will stop us 
from getting crazy errors when systems converge to fixed attractors and we get 
lots of the same data points coming in. This has happened with Rose I think in 
2014, and on email to Sussex in 2015.
Consider this to make sure I'm not missing any good reason not to do it; then 
do it.

Original issue reported on code.google.com by joseph.lizier on 14 Apr 2015 at 4:10

Linear Gaussian TE estimator should probe Exception from CholeskyDecomposition

Following discussion between Wibral, Priesemann and Lizier today, we realised 
that it would be more appropriate for the TE calculator to return specific 
results in different cases where an exception is raised by the Cholesky 
Decomposition (non-positive definite matrices).
These cases are:
1. Non-positive definite covariance matrix between dest past and source -> return zero.
2. Non-positive definite covariance matrix between dest past and next value -> return zero.
3. Non-positive definite between all values: a. if the others were positive definite -> return infinity; else we should not come to this point. (See the sketch below.)
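A hypothetical sketch of the requested case handling (the helper names and exception handling here are invented for illustration, not the actual calculator code):

```java
// Hypothetical sketch of the requested behaviour; helper names are invented.
public class GaussianTeExceptionSketch {

    static double computeTeWithCaseHandling() {
        try {
            return teFromCholeskyOfFullJointCovariance();
        } catch (RuntimeException nonPositiveDefinite) {
            // Cases 1 and 2: the covariance involving dest past with the source,
            // or dest past with the next value, is itself non-positive definite.
            if (!isPositiveDefinite("destPast,source") || !isPositiveDefinite("destPast,destNext")) {
                return 0.0;
            }
            // Case 3a: only the full joint covariance failed -> infinite TE.
            return Double.POSITIVE_INFINITY;
        }
    }

    // Placeholders standing in for the real covariance/Cholesky computations:
    static double teFromCholeskyOfFullJointCovariance() { throw new RuntimeException("non-positive definite"); }
    static boolean isPositiveDefinite(String whichSubmatrix) { return true; }

    public static void main(String[] args) {
        System.out.println(computeTeWithCaseHandling()); // prints Infinity in this toy setup
    }
}
```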



Original issue reported on code.google.com by [email protected] on 5 Jul 2013 at 5:31

Kraskov Mutual info multivariate local MI did not average to MI for large time series

Reported by Ipek Özdemir in distribution 0.1.3 for infodynamics.measures.continuous.kraskov.MutualInfoCalculatorMultiVariateKraskov, identified in unit test infodynamics.measures.continuous.kraskov.MutualInfoMultiVariateTester.testLocalsAverageCorrectly with a different number of time steps (10000).

This has already been fixed in later releases.


Original issue reported on code.google.com by joseph.lizier on 31 Jan 2014 at 10:57
