
pyhmc's Introduction

pyhmc: Hamiltonian Monte Carlo in Python


This package is a straightforward port of the functions hmc2.m and hmc2_opt.m from the MCMCstuff MATLAB toolbox written by Aki Vehtari. The code is originally based on the function hmc.m from the netlab toolbox written by Ian T Nabney. The portion of the algorithm involving "windows" is derived from the C code for this function included in the Software for Flexible Bayesian Modeling written by Radford Neal.

The original Python port was made by Kilian Koepsell, and subsequently modernized by Robert T. McGibbon.

License

This software is distributed under the BSD License (see the LICENSE file).

Example

If you wanted to draw samples from a 5-dimensional Gaussian, you would do something like:

import numpy as np
from pyhmc import hmc

# Log-probability of an axis-aligned Gaussian with inverse variances
# `ivar`, returned together with its gradient as pyhmc expects.
def logprob(x, ivar):
    logp = -0.5 * np.sum(ivar * x**2)
    grad = -ivar * x
    return logp, grad

ivar = 1. / np.random.rand(5)
samples = hmc(logprob, x0=np.random.randn(5), args=(ivar,), n_samples=int(1e4))

# Plot the samples with the triangle package ($ pip install triangle_plot).
import triangle
figure = triangle.corner(samples)
figure.savefig('triangle.png')

[Figure: triangle.png, a corner plot of the drawn samples]


pyhmc's Issues

bad interface

I found that this package is significantly slower than the HMC implementation in MATLAB. The reason is that the interface combines the function and its derivative: the user-supplied function must return both the log-likelihood and its gradient, so both are computed at every leapfrog step. But in fact, HMC only needs to evaluate the log-likelihood itself twice per sample, for the accept/reject step; it is the gradient that is needed at every leapfrog step.
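
For illustration, a split interface along the lines this issue suggests (a hypothetical signature, not pyhmc's actual API) would let the sampler evaluate the gradient at every leapfrog step while computing the log-likelihood only at the trajectory endpoints:

import numpy as np

# Hypothetical split interface: two separate callables instead of a
# single function returning a (logp, grad) pair.
def logp(x, ivar):
    return -0.5 * np.sum(ivar * x**2)

def grad_logp(x, ivar):
    return -ivar * x

# Hypothetical call, not pyhmc's real signature:
# samples = hmc(logp, grad=grad_logp, x0=np.random.randn(5), args=(ivar,))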

The 0.5 scale in the leapfrog method in hmc_main_loop in _hmc.pyx

Hi Robert,

I found that in line 109 of _hmc.pyx there is an additional 0.5 scale in updating p (the full step), whereas in the standard HMC method (Neal, "MCMC using Hamiltonian dynamics") there is no such factor on the full momentum updates. May I ask if I have misunderstood anything?

Thanks,
Xuhui
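
For reference, here is a minimal sketch of the textbook leapfrog scheme from Neal's paper (illustrative code, not the code in _hmc.pyx): the 0.5 factor belongs only on the opening and closing momentum half-steps, while the interior momentum updates use a full epsilon.

import numpy as np

def leapfrog(x, p, grad_logp, epsilon, n_steps):
    # Assumes n_steps >= 1 and that grad_logp returns the gradient of
    # the log-probability (so momentum updates use '+').
    g = grad_logp(x)
    p = p + 0.5 * epsilon * g        # opening momentum half-step
    for step in range(n_steps):
        x = x + epsilon * p          # full position step
        g = grad_logp(x)
        if step < n_steps - 1:
            p = p + epsilon * g      # full interior momentum step
    p = p + 0.5 * epsilon * g        # closing momentum half-step
    return x, p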

Other correlation time estimators

  1. Sokal's notes (page 16), citing an earlier paper, use a procedure for choosing the window for summing the empirical ACF: the window length M is chosen to be the smallest value such that M is at least c times the estimated autocorrelation time, where c is something like 4, 6, or 10 (see the sketch after this list).
  2. Jenke also uses the same scheme, with c=6.
  3. Evertz doesn't seem to like this procedure (page 59).
  4. LaplacesDemon in R uses Geyer's initial positive sequence (IPS) estimator.
  5. Geyer's initial convex sequence (ICS) estimator is another option.
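
A minimal sketch of the windowing rule from item 1, assuming a one-dimensional chain and c=6 (the acf helper below is a plain empirical autocorrelation, not pyhmc's estimator):

import numpy as np

def acf(x):
    # Normalized empirical autocorrelation function of a 1-D chain.
    x = np.asarray(x) - np.mean(x)
    corr = np.correlate(x, x, mode='full')[len(x) - 1:]
    return corr / corr[0]

def integrated_autocorr_time(x, c=6):
    # Sokal's automatic windowing: truncate the sum at the smallest
    # lag M satisfying M >= c * tau_hat(M); falls back to the full
    # window if the condition is never met.
    rho = acf(x)
    tau = 1.0
    for m in range(1, len(rho)):
        tau = 1.0 + 2.0 * np.sum(rho[1:m + 1])
        if m >= c * tau:
            break
    return tau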

RuntimeWarning: invalid value encountered in log with Beta Distribution.

I'm trying to sample a beta distribution with a = n + 44 and b = s + 102, where n and s are constants, with a scale constant C = 0.01 and starting value p = 0.2, for 1000 iterations.

My beta density looks like

g(p) = C * p^(a-1) * (1-p)^(b-1)
log g(p) = log(C) + (a-1) log(p) + (b-1) log(1-p)
d(log g)/dp = (a-1)/p - (b-1)/(1-p)

import numpy
import pyhmc

sample_size = 31
sum_spacings = 43
starting_value = 0.2
num_iterations = 1000
constant_c = 0.01

# Log posterior of the beta density above (a = n + 44, b = s + 102),
# returned together with its gradient.
def log_posterior_grad(p, n, s):
    log_posterior_beta = numpy.log(constant_c) + (n + 44 - 1) * numpy.log(p) + (s + 102 - 1) * numpy.log(1 - p)
    gradient = (n + 44 - 1) / p - (s + 102 - 1) / (1 - p)
    return log_posterior_beta, gradient

samples = pyhmc.hmc(log_posterior_grad, x0=numpy.array([starting_value]),
                    args=(sample_size, sum_spacings), n_samples=num_iterations)
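
The warning itself appears whenever a leapfrog step carries p outside (0, 1), so numpy.log sees a non-positive argument. One common workaround, not part of the original report, is to sample an unconstrained variable and transform back; a minimal sketch using the logit transform, with the same a and b as above:

import numpy
from pyhmc import hmc

a, b = 31 + 44, 43 + 102  # beta parameters from the issue

def log_posterior_logit(z):
    # z = logit(p), so p = sigmoid(z) always lies in (0, 1).
    p = 1.0 / (1.0 + numpy.exp(-z))
    # Beta(a, b) log density in p (up to a constant) plus the
    # log-Jacobian log|dp/dz| = log(p) + log(1 - p).
    logp = numpy.sum(a * numpy.log(p) + b * numpy.log(1 - p))
    # d(logp)/dz, using dp/dz = p * (1 - p).
    grad = a * (1 - p) - b * p
    return logp, grad

samples_z = hmc(log_posterior_logit, x0=numpy.zeros(1), n_samples=1000)
samples_p = 1.0 / (1.0 + numpy.exp(-samples_z))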

Setup.py dependencies

The dependencies "statsmodels" and "cython" should be added to the install_requires in the setup.py script.
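
A minimal sketch of the suggested change (the setup() call here is illustrative; the project's real setup.py carries more metadata and the Cython extension modules):

from setuptools import setup

setup(
    name='pyhmc',
    # ... existing arguments unchanged ...
    install_requires=['statsmodels', 'cython'],  # plus whatever is already listed, e.g. numpy
)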

error in the README due to a numpy update

In the README and the docs, the number of samples specified in the simple example is written as n_samples=1e4. With a recent numpy update (not sure which), numpy no longer automatically converts a float to an int, so you hit an error at the line in hmc.py that reads samples = np.zeros((n_samples, n_params)) if you run the sample script as-is. If you change it to n_samples=10000 (or int(1e4)) it works fine.
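
For illustration, the failure mode (a sketch; the exact message may vary with the numpy version):

import numpy as np

np.zeros((int(1e4), 5))  # works: shape entries are integers
# np.zeros((1e4, 5))     # raises TypeError: 'float' object cannot be
#                        # interpreted as an integer on recent numpy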
