acerbilab / bads

Bayesian Adaptive Direct Search (BADS) optimization algorithm for model fitting in MATLAB

License: GNU General Public License v3.0

Languages: MATLAB 70.15%, Fortran 21.11%, HTML 4.34%, C++ 3.15%, Mathematica 0.81%, Makefile 0.26%, C 0.17%, CSS 0.01%
Topics: optimization-algorithms, bayesian-optimization, log-likelihood, noiseless-functions, noisy-functions, matlab

bads's Introduction

Bayesian Adaptive Direct Search (BADS) - v1.1.2

News

  • 31/Oct/22: BADS 1.1.1 released! Added full support for user-specified noise (e.g., for heteroskedastic targets) and several fixes.
  • If you are interested in Bayesian model fitting, check out Variational Bayesian Monte Carlo (VBMC), a simple and user-friendly toolbox for Bayesian posterior and model inference that we published at NeurIPS (2018, 2020).

What is it?

BADS is a fast hybrid Bayesian optimization algorithm designed to solve difficult optimization problems, in particular those related to fitting computational models (e.g., via maximum likelihood estimation). The original BADS paper was presented at NeurIPS in 2017 [1].

BADS has been intensively tested for fitting behavioral, cognitive, and neural models, and is currently used in many computational labs around the world. In our benchmark with real model-fitting problems, BADS performed on par with or better than many other common and state-of-the-art MATLAB optimizers, such as fminsearch, fmincon, and cmaes [1].

BADS is recommended when no gradient information is available, and the objective function is non-analytical or noisy, for example evaluated through numerical approximation or via simulation.

BADS requires no specific tuning and runs off-the-shelf like other built-in MATLAB optimizers such as fminsearch.

Notes

  • If you are interested in estimating posterior distributions (i.e., uncertainty and error bars) over parameters, and not just point estimates, you might want to check out Variational Bayesian Monte Carlo, a toolbox for Bayesian posterior and model inference which can be used in synergy with BADS.
  • BADS is currently available only for MATLAB. A Python port, PyBADS, will be released soon (end of 2022).

Installation

Download the latest version of BADS as a ZIP file.

  • To install BADS, clone the repository or unpack the ZIP file where you want it and run the script install.m (a manual alternative is sketched after this list).
    • This will add the BADS base folder to the MATLAB search path.
  • To see if everything works, run bads('test').
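
If you prefer to manage the MATLAB path yourself (for instance with a package manager, as one user mentions in the issues below), a minimal manual setup might look like the following sketch; the folder path is a placeholder for wherever you unpacked BADS.

% Manual alternative to install.m (hypothetical path; adjust to your setup).
addpath('C:/toolboxes/bads');   % prepend the BADS base folder to the search path
savepath;                       % optional: persist the path across sessions
bads('test');                   % quick self-test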

Quick start

The BADS interface is similar to that of other MATLAB optimizers. The basic usage is:

[X,FVAL] = bads(FUN,X0,LB,UB,PLB,PUB);

with input parameters:

  • FUN, a function handle to the objective function to minimize (typically, the negative log likelihood of a dataset and model, for a given input parameter vector);
  • X0, the starting point of the optimization (a row vector);
  • LB and UB, hard lower and upper bounds;
  • PLB and PUB, plausible lower and upper bounds, that is, a box where you would expect to find almost all solutions.

The output parameters are:

  • X, the found optimum.
  • FVAL, the (estimated) function value at the optimum.
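
Putting these together, here is a minimal sketch of a call to BADS; the noisy quadratic objective, bounds, and starting point below are illustrative only, not taken from the official examples.

% Minimal illustrative call (all values below are made up for this sketch).
fun = @(x) sum(x.^2) + 0.1*randn;     % noisy objective to minimize
x0  = [1 1];                          % starting point (row vector)
lb  = [-10 -10];   ub  = [10 10];     % hard bounds
plb = [-3 -3];     pub = [3 3];       % plausible bounds
[X,FVAL] = bads(fun,x0,lb,ub,plb,pub);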

For more usage examples, see bads_examples.m. You can also type help bads to display the documentation.

For practical recommendations, such as how to set LB and UB, and any other question, check out the FAQ on the BADS wiki.

Note: BADS is a semi-local optimization algorithm, in that it can escape local minima better than many other methods, but it can still get stuck. The best performance for BADS is obtained by running the algorithm multiple times from distinct starting points (see here).
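
For example, a simple multi-start wrapper might look like the sketch below, reusing the illustrative fun, lb, ub, plb, and pub from the quick-start example above. Note that for noisy objectives FVAL is only an estimate, so picking the best run by FVAL is approximate.

% Illustrative multi-start loop: run BADS from random points in [plb,pub].
nRuns = 5;  bestFval = Inf;  bestX = [];
for iRun = 1:nRuns
    x0 = plb + rand(size(plb)).*(pub - plb);   % random start in the plausible box
    [x,fval] = bads(fun,x0,lb,ub,plb,pub);
    if fval < bestFval                         % keep the best run so far
        bestFval = fval;  bestX = x;
    end
end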

How does it work?

BADS follows a mesh adaptive direct search (MADS) procedure for function minimization that alternates poll steps and search steps (see Fig 1).

  • In the poll stage, points are evaluated on a mesh by taking steps in one direction at a time, until an improvement is found or all directions have been tried. The step size is doubled in case of success, halved otherwise.
  • In the search stage, a Gaussian process (GP) is fit to a (local) subset of the points evaluated so far. Then, we iteratively choose points to evaluate according to a lower confidence bound (LCB) strategy that trades off exploration of uncertain regions (high GP uncertainty) against exploitation of promising solutions (low GP mean); a schematic sketch of this criterion is given after Fig 1.

Fig 1: BADS procedure.
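
To make the lower confidence bound criterion concrete, here is a schematic MATLAB sketch. The names gp_mu, gp_sigma, X_cand, and the value of sqrtbeta are illustrative placeholders; this is not the actual BADS implementation.

% Schematic LCB acquisition (placeholders only; not BADS's own code).
% gp_mu and gp_sigma stand for the GP posterior mean and standard deviation
% evaluated at a set of candidate points X_cand (one candidate per row).
sqrtbeta = 2;                          % exploration weight (assumed value)
lcb = gp_mu - sqrtbeta .* gp_sigma;    % lower confidence bound
[~,idx] = min(lcb);                    % most promising candidate
x_next = X_cand(idx,:);                % next point to evaluate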

See here for a visualization of several optimizers at work, including BADS.

See our paper for more details [1].

Troubleshooting

If you have trouble doing something with BADS, check out the FAQ on the BADS wiki or ask in the lab Discussions forum.

This project is under active development. If you find a bug, or anything that needs correction, please let us know.

Reference

  1. Acerbi, L. & Ma, W. J. (2017). Practical Bayesian Optimization for Model Fitting with Bayesian Adaptive Direct Search. In Advances in Neural Information Processing Systems 30, pages 1834-1844. (link, arXiv preprint)

You can cite BADS in your work with something along the lines of

We optimized the log likelihoods of our models using Bayesian adaptive direct search (BADS; Acerbi and Ma, 2017). BADS alternates between a series of fast, local Bayesian optimization steps and a systematic, slower exploration of a mesh grid.

Besides formal citations, you can demonstrate your appreciation for BADS in the following ways:

  • Star the BADS repository on GitHub;
  • Follow Luigi Acerbi on Twitter for updates about BADS and other projects from the lab;
  • Tell us about your model-fitting problem and your experience with BADS (positive or negative) in the lab Discussions forum.

BibTeX

@article{acerbi2017practical,
  title={Practical {B}ayesian Optimization for Model Fitting with {B}ayesian Adaptive Direct Search},
  author={Acerbi, Luigi and Ma, Wei Ji},
  journal={Advances in Neural Information Processing Systems},
  volume={30},
  pages={1834--1844},
  year={2017}
}

License

BADS is released under the terms of the GNU General Public License v3.0.

bads's People

Contributors

lacerbi


bads's Issues

Parallel function evaluations?

Is it possible to evaluate multiple cost-function values at once and then produce the next set of candidate parameters to evaluate, i.e., batch-update the internal model state with parallel function evaluations?

enhance BADS robustness against uninformative parameters?

Like many others, I use BADS to compare distinct model mechanisms. In the course of this comparison I test adding and removing parameters that control these additional mechanisms.

For practical reasons, it would be great if BADS were robust to "unused" parameters in a model (parameters that are used in related models). By unused I mean parameters that are disabled internally in the model, so the model behaves exactly the same no matter what value they take.

I tested this, and BADS is sensitive to "unused" parameters, showing worse optimization results compared to models where these unused parameters are manually removed from the optimization.

If it were only for practical reasons, I could understand closing this issue because it is of little conceptual benefit (as said, one can manually remove unused parameters from the list of parameters sent to BADS and the problem is solved).

However, conceptually, shouldn't the optimization algorithm (especially BADS, given its philosophy) recognize uninformative parameters and minimize the "time" spent trying to optimize along their dimensions?

Also, parameter "information" is not usually binary as in the example above. But perhaps this binary example of used vs. unused parameters could help find a better way for BADS to prioritize the direction of convergence along the most "useful" parameters.

I think working on this issue would be interesting in terms of research and I wonder if you ever considered enhancing BADS further in this regard.

I didn't test, but this issue may affect VBMC as well.

other metaheuristics

Is it possible to replace the evolution strategy (ES, CMA-ES) used inside BADS with another metaheuristic?

Search in a specified discrete combinatorial space

Is it possible to specify discrete valid values for each dimension of the parameter (of the cost function), such that only those values on the grid are evaluated? A special case would be optimizing a single integer variable. An extension would be the ability to mark some dimensions of the parameter as discrete and specify the discrete space over which they take values. Some problems are discrete in nature, but combinatorics kills any attempt at a greedy grid search due to the curse of dimensionality.

One possible solution would be to put a map inside the evaluated function (logL) such that it maps the continuous space into a discrete one (e.g., p(1) = round(p(1))). But this wastes function evaluations for no good reason, and would be terrible for slow function evaluations.
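
For reference, the workaround mentioned above could be wrapped around the objective roughly as sketched below; logL, x0, and the bounds are hypothetical names from this issue, not part of BADS, and as noted this approach wastes evaluations.

% Hypothetical wrapper that rounds the first parameter before evaluation.
logL_discrete = @(p) logL([round(p(1)), p(2:end)]);
[x,fval] = bads(logL_discrete, x0, lb, ub, plb, pub);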

Inaccurate error when all variables fixed

When bads is called with LB, UB, PLB, and PUB all equal (and thus fixing every dimension of the search), this error is returned:

Error using bads (line 343)
If no starting point is provided, PLB and PUB need to be specified.

Error in fixedbads (line 41)
[x_free,fval] = bads(fun_fix,x0,LB,UB,PLB,PUB,nonbcon_fix,options);

Error in bads (line 386)

The motivation for wanting to call BADS with every dimension fixed is to obtain the value of the objective function (and non linear constraint) from a nested function. This allows for a tidy arrangement of the code.


Shadowing "unwrap"

Loving BADS!

I use a MATLAB package manager (https://github.com/ToolboxHub/ToolboxToolbox) to handle path dependencies, so I do not make use of the BADS "install" function.

I have found that the BADS directories must be prepended to the MATLAB path for BADS to work. This is because BADS bundles a function "unwrap" (.../bads/gpml-matlab-v3.6-2015-07-07/util/unwrap.m) which shadows the built-in MATLAB function of the same name but has different functionality.

While I have been able to finesse the situation in my case, I thought you might want to know about this conflict and consider giving your function a different name to avoid the collision.

Best,

Geoff
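
For anyone hitting the same conflict, a quick way to check which unwrap MATLAB will dispatch to is the standard which command; the install path below is a placeholder.

% List every unwrap on the path; the first entry is the one MATLAB will call.
which unwrap -all
% addpath prepends by default, so the bundled GPML utilities win the lookup:
addpath(genpath('C:/toolboxes/bads'));   % hypothetical install location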

warn ID for mesh overflow

I would like to be able to selectively silence the warning:

Warning: The mesh attempted to expand above maximum size too many times. Try widening PLB and PUB.

To do so, the warning needs to have an associated warning identifier. May I propose that line 1491 in the function bads.m be changed from:

warning('The mesh attempted to expand above maximum size too many times. Try widening PLB and PUB.');

to

warning('bads:meshOverflow', 'The mesh attempted to expand above maximum size too many times. Try widening PLB and PUB.');
(PS: I see that I could set options.MeshOverflowsWarning = Inf. My motivation for the suggested code change is to still record the presence of the warning, but suppress the report to the console)
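
With such an identifier in place, the warning could then be silenced selectively in user code, along these lines; 'bads:meshOverflow' is the identifier proposed above, not something BADS currently defines.

% Suppress only the proposed mesh-overflow warning, leaving others untouched.
warning('off','bads:meshOverflow');
[x,fval] = bads(fun,x0,lb,ub,plb,pub);
warning('on','bads:meshOverflow');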

Problems when moving plb & pub

Hello,

I think there may be some potential problems with how the plausible bounds plb & pub are moved.

The initialization of one parameter in my model:
lb: 6.14421235e-06
ub: 1.62754791e+05
plb: 0.04978707
pub: 20.08553692
X0: 10.067662

It then says: bads:InitialPointsTooClosePB: The starting points X0 are on or numerically too close to the hard bounds lb and ub. Moving the initial points more inside...

Now the value becomes:
X0: 162.75479756

It then says: bads:TooCloseBounds: For each variable, hard and plausible bounds should not be too close. Moving plausible bounds.

Now the value becomes:
LB_eff: 1.62754798e+02
UB_eff: 1.62592037e+05
plb: 1.62754798e+02
pub: 20.08553692

It then says: bads:InitialPointsOutsidePB. The starting points X0 are not inside the provided plausible bounds plb and pub. Expanding the plausible bounds...
ValueError: bads:StrictBounds: For each variable, hard and
plausible bounds should respect the ordering lb <= plb < pub <= ub.

Your help is greatly appreciated. Thanks!

More than 40 dimensions

Hi,

I have found BADS really convenient to use. Thanks a lot for sharing it!

I am trying to use it to solve an optimisation problem with an input of more than 40 dimensions (about 100 dimensions in the worst case). When I run the code, MATLAB shows the following error:

"I4_SOBOL - Fatal error!
The spatial dimension DIM_NUM should satisfy:
1 <= DIM_NUM <= 40"

I noticed that this is because the function 'i4_sobol.m' does not support more than 40 dimensions. A quick check online suggests there is no updated MATLAB function supporting inputs with more than 40 dimensions. I wonder if there is any existing solution to this?
Thanks in advance!

Error when setting bounds equal to X0

I am encountering the following error when running bads_examples.m:

Function 'subsindex' is not defined for values of class 'struct'.

Error in bads>@(x,optimState,state)outputfun(expandvars(x,fixidx,fixedvars),optimState,state) (line 375)
outputfun_fix = @(x,optimState,state) outputfun(expandvars(x,fixidx,fixedvars),optimState,state);

Error in bads (line 429)
isFinished_flag = outputfun(origunits(u,optimState),optimState,'init');

Error in fixedbads (line 46)
[x_free,fval,exitflag,output] = bads(fun_fix,x0,LB,UB,PLB,PUB,nonbcon_fix,options);

Error in bads (line 382)
[x,fval,exitflag,output,optimState,gpstruct] =
fixedbads(fun_fix,x0,LB,UB,PLB,PUB,nonbcon_fix,options,fixidx,nargout);

Error in bads_examples (line 249)
[x,fval,exitflag,output] = bads(fun,x0,lb,ub,plb,pub,[],options,mu);

I obtain the same error when I call bads.m from my own routine and set the upper and lower bounds of a parameter equal to each other and to x0.

Running under Mac OS X 10.12.5, Matlab R2016b. The only code in the path (apart from various Matlab toolboxes) is BADS.
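
A hedged sketch of the kind of call described above; the objective and values are illustrative, with the second variable fixed by making its bounds coincide with its starting value.

% Illustrative reproduction: the second variable is fixed at 1.
fun = @(x) sum(x.^2);
x0  = [0.5 1];
lb  = [-5 1];   ub  = [5 1];     % lb(2) == ub(2) == x0(2)
plb = [-2 1];   pub = [2 1];
[x,fval] = bads(fun,x0,lb,ub,plb,pub);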

Less dependency?

The package has quite a few dependencies structured into a couple of directories. Would it be possible to trim down all nonessential dependencies and use a standard directory structure like ./bin, ./lib/share, ./lib/, etc.?
