
mogpe - Mixtures of Gaussian Process Experts in TensorFlow

Documentation

Disclaimer: This is research code that I have rewritten/documented/packaged as a learning experience (it currently has no tests).

This package implements a Mixture of Gaussian Process Experts (MoGPE) model with a GP-based gating network. Inference exploits factorization through sparse GPs and trains a variational lower bound stochastically. It also provides the building blocks for implementing other mixture of Gaussian process experts models. mogpe uses GPflow 2.1/TensorFlow 2.4+ for running computations, which allows fast execution on GPUs, and requires Python ≥ 3.8. It was originally created by Aidan Scannell.
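
For a model with K experts, the predictive density takes the standard mixture-of-experts form (written here only as a reminder of the general structure, not the exact notation used in the code or docs):

p(y | x) = Σ_{k=1}^{K} Pr(α = k | x) · p_k(y | x)

where Pr(α = k | x) is the probability that the GP-based gating network assigns input x to expert k, and p_k(y | x) is the k-th GP expert's predictive density. MixtureOfSVGPExperts lower-bounds the log marginal likelihood of this model with a variational bound over the sparse GP experts and the gating network, and maximises the bound stochastically on mini-batches.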

Install mogpe

This is a Python package that should be installed into a virtual environment. Start by cloning the repo from Github:

git clone https://github.com/aidanscannell/mogpe.git

The package can then be installed into a virtual environment by adding it as a local dependency.

Install with Poetry

mogpe’s dependencies and packaging are managed with Poetry (rather than other tools such as Pipenv). To install mogpe into an existing Poetry environment, add it as a dependency under [tool.poetry.dependencies] (in the ./pyproject.toml configuration file) with the following line:

mogpe = {path = "/path/to/mogpe"}

If you want to develop the mogpe codebase then set develop=true:

mogpe = {path = "/path/to/mogpe", develop=true}

The dependencies in a ./pyproject.toml file are resolved and installed with:

poetry install

If you do not require the development packages then you can opt to install without them,

poetry install --no-dev

Running Python scripts inside Poetry Environments

There are multiple ways to run code with Poetry and I advise checking out the documentation. My favourite option is to spawn a shell within the virtual environment:

poetry shell

and then Python scripts can simply be run with:

python codey_mc_code_face.py

Alternatively, you can run scripts without spawning a shell in the virtual environment using the following command:

poetry run python codey_mc_code_face.py

I much prefer using Poetry; however, it can feel quite slow for some operations and, annoyingly, it doesn't integrate that well with Read the Docs. A setup.py file is still needed for building the docs on Read the Docs, so I use Dephell to generate the requirements.txt and setup.py files from pyproject.toml with:

dephell deps convert --from=pyproject.toml --to=requirements.txt
dephell deps convert --from=pyproject.toml --to=setup.py

Install with pip

Create a new virtualenv and activate it, for example,

mkvirtualenv --python=python3 mogpe-env
workon mogpe-env

cd into the root of this package and install it and its dependencies with,

pip install .

If you want to develop the mogpe codebase then install it in “editable” or “develop” mode with:

pip install -e .

Usage

The model (and training, with optional logging and checkpointing) can be configured using a TOML file. Please see the examples directory, which shows how to configure and train MixtureOfSVGPExperts on multiple data sets. See the notebooks (two experts and three experts) for how to define and train an instance of MixtureOfSVGPExperts without configuration files.

mogpe.mixture_of_experts

mogpe.mixture_of_experts contains an abstract base class for mixture of experts models as well as the main MixtureOfSVGPExperts class. The MixtureOfSVGPExperts class implements a variational lower bound for a mixture of Gaussian process experts with a GP-based gating network. The MixtureOfExperts base class relies on composition and its constructor requires an instance of the GatingNetworkBase class and an instance of the ExpertsBase class (defined in gating_networks and experts respectively).

The abstract base classes outline the methods that gating networks and experts must implement in order to be used with a child of MixtureOfExperts. Please see the docs for more details on gating_networks and experts.
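
As a rough illustration of this composition (the class names come from the paragraphs above, but the import path and keyword arguments below are assumptions rather than the package's documented signatures):

# Sketch of the composition described above; the import path and keyword names are assumptions.
from mogpe.mixture_of_experts import MixtureOfSVGPExperts

experts = ...          # an instance of an ExpertsBase subclass, e.g. a set of sparse GP experts
gating_network = ...   # an instance of a GatingNetworkBase subclass built on (sparse) GPs

# A MixtureOfExperts child composes the two components:
model = MixtureOfSVGPExperts(gating_network=gating_network, experts=experts)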

mogpe.training

The training directory contains methods for three different training loops, saving and loading the model, and initialising the model (and training) from TOML config files.

Training Loops

The mogpe.training.training_loops directory contains three different training loops:

  1. A simple TensorFlow training loop (a minimal sketch of this pattern is given after this list),
  2. A monitoring tf training loop - a TensorFlow training loop with monitoring inside tf.function(). This loop only monitors the model parameters and the loss (ELBO) and does not generate images.
  3. A monitoring training loop - this loop generates images during training. The matplotlib functions cannot be placed inside the tf.function, so this training loop is slower but provides more insight.
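
As a point of reference, the simple training loop (option 1) follows the standard TensorFlow pattern of stochastically minimising the negative ELBO with tf.GradientTape. The sketch below illustrates that pattern only; the method name model.elbo is an assumption rather than mogpe's exact API:

import tensorflow as tf

def simple_training_loop(model, dataset, num_epochs=100, learning_rate=1e-3):
    # Generic sketch of a simple TensorFlow training loop; `model.elbo(batch)` is a
    # hypothetical method returning the variational lower bound for a mini-batch.
    optimizer = tf.keras.optimizers.Adam(learning_rate)

    @tf.function  # compile the optimisation step into a graph for speed
    def optimisation_step(batch):
        with tf.GradientTape() as tape:
            loss = -model.elbo(batch)  # minimise the negative ELBO
        gradients = tape.gradient(loss, model.trainable_variables)
        optimizer.apply_gradients(zip(gradients, model.trainable_variables))
        return loss

    for epoch in range(num_epochs):
        for batch in dataset:  # dataset assumed to be a batched tf.data.Dataset of (X, Y) tuples
            loss = optimisation_step(batch)
        tf.print("Epoch", epoch, "negative ELBO:", loss)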

To use TensorBoard, cd to the logs directory and start TensorBoard,

cd /path-to-log-dir
tensorboard --logdir . --reload_multifile=true

TensorBoard can then be viewed by visiting http://localhost:6006/ in your browser.

Saving/Loading

mogpe.training.utils contains methods for loading and saving the model. See the examples for how to use them.
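
For orientation, saving and restoring a GPflow/TensorFlow model typically follows the standard tf.train.Checkpoint pattern; the generic sketch below shows that pattern and is not claimed to be what mogpe.training.utils does internally:

import tensorflow as tf

def save_and_restore(model: tf.Module, ckpt_dir: str = "./ckpts"):
    # Generic TensorFlow checkpointing pattern (not necessarily mogpe.training.utils' API).
    ckpt = tf.train.Checkpoint(model=model)
    manager = tf.train.CheckpointManager(ckpt, directory=ckpt_dir, max_to_keep=3)
    save_path = manager.save()                # write a checkpoint of the current parameters
    ckpt.restore(manager.latest_checkpoint)   # restore the most recently saved checkpoint
    return save_path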

TOML Config Parsers

mogpe.training.toml_config_parsers contains methods for 1) initialising the MixtureOfSVGPExperts class and 2) training it from a TOML config file. See the examples for how to use the TOML config parsers.
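
Purely as a hypothetical illustration of the workflow (the function names below are placeholders; consult the examples for the parsers' real names and signatures):

# Hypothetical usage sketch -- the function names are placeholders, not mogpe's actual API;
# the examples directory shows the real entry points in mogpe.training.toml_config_parsers.
from mogpe.training import toml_config_parsers

config_file = "path/to/config.toml"  # TOML file configuring the model, training and logging
model = toml_config_parsers.create_model_from_config(config_file)  # placeholder name
toml_config_parsers.train_from_config(model, config_file)          # placeholder name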

mogpe.helpers

The helpers directory contains classes to aid plotting models with 1D and 2D inputs. These are used by the monitoring training loops.
