
LtU-ILI


The Learning the Universe Implicit Likelihood Inference (LtU-ILI) pipeline is an all-in-one framework for performing machine learning parameter inference in astrophysics and cosmology. Given labeled training data $\{(x_i,\theta_i)\}_{i=1}^N$ or a stochastic simulator $x(\theta)$, LtU-ILI is designed to automatically train state-of-the-art neural networks to learn the data-parameter relationship and produce robust, well-calibrated posterior inference.

The pipeline is quick and easy to set up; here's an example of training a Masked Autoregressive Flow (MAF) network to predict a posterior over parameters $y$, given input data $x$:

import ili                                      # Imports

X, Y = load_data()                              # Load training data and parameters
loader = ili.data.NumpyLoader(X, Y)             # Create a data loader

trainer = ili.inference.InferenceRunner.load(
  backend = 'sbi', engine='NPE',                # Choose a backend and inference engine (here, Neural Posterior Estimation)
  prior = ili.utils.Uniform(low=-1, high=1),    # Define a prior 
  # Define a neural network architecture (here, MAF)
  nets = [ili.utils.load_nde_sbi(engine='NPE', model='maf')]  
)

posterior, _ = trainer(loader)                  # Run training to map data -> parameters

samples = posterior.sample(                     # Generate 1000 samples from the posterior for input X[0]
  x=X[0], sample_shape=(1000,)
)
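To run the example end-to-end, `load_data` must supply labeled pairs from some simulator. Here is a minimal toy sketch of such a function; the Gaussian simulator and this `load_data` are hypothetical stand-ins for your own data pipeline, not part of LtU-ILI:

```python
import numpy as np

def simulator(theta, rng):
    # Toy stochastic simulator: data is the parameter vector plus Gaussian noise
    return theta + 0.1 * rng.standard_normal(theta.shape)

def load_data(n_sims=1000, n_params=2, seed=0):
    # Draw parameters from the Uniform(-1, 1) prior used in the example above,
    # then simulate one data vector per parameter draw
    rng = np.random.default_rng(seed)
    Y = rng.uniform(-1.0, 1.0, size=(n_sims, n_params))  # parameters theta
    X = simulator(Y, rng)                                # simulated data x
    return X.astype(np.float32), Y.astype(np.float32)

X, Y = load_data()
```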

Beyond this simple example, LtU-ILI comes with a wide range of customizable complexity, including:

  • Posterior-, Likelihood-, and Ratio-Estimation methods for ILI, including Sequential learning analogs
  • Various neural density estimators (Mixture Density Networks, Conditional Normalizing Flows, ResNet-like ratio classifiers)
  • Fully-customizable, exotic embedding networks (including CNNs and Graph Neural Networks)
  • A unified interface for multiple ILI backends (sbi, pydelfi, and lampe)
  • Multiple marginal and multivariate posterior coverage metrics
  • Jupyter and command-line interfaces
  • A parallelizable configuration framework for efficient hyperparameter tuning and production runs
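To illustrate what a marginal coverage metric checks: for a well-calibrated posterior, the rank of the true parameter among posterior samples should be uniformly distributed across test points. Below is a simplified NumPy sketch of this idea using an analytically tractable Gaussian model; it illustrates the principle only and is not the LtU-ILI metrics API:

```python
import numpy as np

def marginal_ranks(truths, samples):
    # Fraction of posterior samples below the truth, per test point;
    # a calibrated posterior yields approximately uniform ranks on [0, 1]
    return (samples < truths[:, None]).mean(axis=1)

rng = np.random.default_rng(0)
n_test, n_samples = 500, 1000
theta = rng.standard_normal(n_test)            # theta ~ N(0, 1) prior
x = theta + rng.standard_normal(n_test)        # x | theta ~ N(theta, 1)
# Exact posterior for this conjugate model: theta | x ~ N(x/2, 1/2)
post = x[:, None] / 2 + np.sqrt(0.5) * rng.standard_normal((n_test, n_samples))
ranks = marginal_ranks(theta, post)
```

Plotting a histogram of `ranks` (or their empirical CDF against the diagonal) reveals over- or under-confident posteriors.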

For more details on the motivation, design, and theoretical background of this project, see the software release paper (arXiv:2402.05137).

Getting Started

To install LtU-ILI, follow the instructions in INSTALL.md.

To get started, try out the tutorial for the Jupyter notebook interface in notebooks/tutorial.ipynb or the command line interface in examples/.

API Documentation

The documentation for this project can be found at this link.

References

We keep an updated repository of relevant interesting papers and resources at this link.

Contributing

Before contributing, please familiarize yourself with the contribution workflow described in CONTRIBUTING.md.

Contact

If you have comments, questions, or feedback, please open an issue. The current leads of the Learning the Universe ILI working group are Benjamin Wandelt ([email protected]) and Matthew Ho ([email protected]).

Contributors

  • Matt Ho: 💻 🎨 💡 📖 👀 🚇 🖋 🔬
  • Deaglan Bartlett: 💻 🎨 💡 📖 👀 🚇 🖋 🔬
  • Nicolas Chartier: 💡 📖 🔬 💻 🎨 👀 🖋
  • Carolina Cuesta: 💻 🎨 💡 📖 👀 🔬
  • Simon: 💻 💡
  • Axel Lapel: 💻 🔬 💡
  • Pablo Lemos: 🎨 💻
  • Chris Lovell: 🔬 💡 🔣 🖋
  • T. Lucas Makinen: 💻 🔬
  • Chirag Modi: 🎨 💻
  • Shivam Pandey: 🔬 💡
  • L.A. Perez: 🔬 🖋

Acknowledgements

This work is supported by the Simons Foundation through the Simons Collaboration on Learning the Universe.
