
bayesopt's Introduction

BayesOpt: A Bayesian optimization library

BayesOpt is an efficient implementation of the Bayesian optimization methodology for nonlinear optimization, experimental design and hyperparameter tuning.

Bayesian optimization uses a distribution over functions to build a surrogate model of the unknown function whose optimum we seek, and then applies an active learning strategy to select the query points with the greatest potential interest or improvement. It is therefore a sample-efficient method for nonlinear optimization, design of experiments and simulations, or bandit-like problems. It is currently used in many scientific and industrial applications. In the literature it is also called Sequential Kriging Optimization (SKO), Sequential Model-Based Optimization (SMBO) or Efficient Global Optimization (EGO).
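The loop just described (fit a surrogate, optimize an acquisition criterion, evaluate, repeat) can be illustrated with a self-contained toy in plain Python. This is not BayesOpt's implementation or API, only a minimal 1-D sketch: a Gaussian process surrogate with a squared-exponential kernel and a lower confidence bound criterion, minimizing a simple quadratic.

```python
import math

def sqexp(a, b, ls=0.2):
    """Squared-exponential kernel with length scale ls."""
    return math.exp(-0.5 * ((a - b) / ls) ** 2)

def solve(A, b):
    """Solve A x = b via Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def gp_posterior(xs, ys, xq):
    """GP posterior mean/variance at xq (unit prior variance, tiny jitter)."""
    n = len(xs)
    K = [[sqexp(xs[i], xs[j]) + (1e-6 if i == j else 0.0) for j in range(n)]
         for i in range(n)]
    kq = [sqexp(x, xq) for x in xs]
    mean = sum(k * a for k, a in zip(kq, solve(K, ys)))
    var = max(1.0 - sum(k * v for k, v in zip(kq, solve(K, kq))), 0.0)
    return mean, var

def bayes_opt(f, iters=10):
    """Toy Bayesian optimization of f on [0, 1] with an LCB criterion."""
    xs = [0.0, 0.5, 1.0]                    # initial design
    ys = [f(x) for x in xs]
    grid = [i / 100.0 for i in range(101)]  # candidate query points
    for _ in range(iters):
        best_lcb, best_q = float("inf"), None
        for q in grid:
            if q in xs:
                continue
            m, v = gp_posterior(xs, ys, q)
            lcb = m - 2.0 * math.sqrt(v)    # exploit low mean, explore high variance
            if lcb < best_lcb:
                best_lcb, best_q = lcb, q
        xs.append(best_q)
        ys.append(f(best_q))
    i = min(range(len(ys)), key=ys.__getitem__)
    return xs[i], ys[i]

best_x, best_y = bayes_opt(lambda x: (x - 0.3) ** 2)
```

Even this crude version tends to home in on the minimum with a handful of evaluations, which is the point of the method: each query is spent where the surrogate is either promising (low mean) or uncertain (high variance).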

BayesOpt is licensed under the AGPL and it is free to use. However, if you use BayesOpt in a work that leads to a scientific publication, we would appreciate it if you would kindly cite BayesOpt in your manuscript.

Ruben Martinez-Cantin, BayesOpt: A Bayesian Optimization Library for Nonlinear Optimization, Experimental Design and Bandits. Journal of Machine Learning Research, 15(Nov):3735--3739, 2014.

The paper can be found at http://jmlr.org/papers/v15/martinezcantin14a.html

Commercial applications may also acquire a commercial license. Please contact [email protected] for details.

Getting and installing BayesOpt

The library can be downloaded from GitHub: https://github.com/rmcantin/bayesopt

You can also get the cutting-edge version from the repositories:

>> git clone https://github.com/rmcantin/bayesopt

The online documentation, which includes an installation guide, can be found at: http://rmcantin.github.io/bayesopt/html/

Questions and issues


Copyright (C) 2011-2020 Ruben Martinez-Cantin [email protected]

BayesOpt is free software: you can redistribute it and/or modify it under the terms of the GNU Affero General Public License as published by the Free Software Foundation, version 3 of the License.

BayesOpt is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Affero General Public License for more details.

You should have received a copy of the GNU Affero General Public License along with BayesOpt. If not, see http://www.gnu.org/licenses/.


bayesopt's People

Contributors

ericfrederich, jgbarcos, kiudee, nebw, rmcantin


bayesopt's Issues

R wrapper

I realize this may be out-of-scope, but it's just out of my abilities to contribute myself and I hope a community member will take it up. An R wrapper for BayesOpt would be great. With Rcpp it should be (relatively) easy to wrap the C++ API, much like nloptr does for NLopt.

Python 3 support

There isn't much Python code in this repo at all, just demos really.
The interface is written in Cython, which emits code that works with Python 2 or 3.

Because of this, it is trivial to support both 2 and 3 from the same code base.

Just some changes to print statements within the demos, and a simple bytes/strings conversion on the bopt parameters.
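For context, the bytes/strings conversion mentioned here is the standard Python 2 → 3 issue for Cython modules that pass strings down to C. A generic sketch of the idea (the parameter names below mimic BayesOpt's style but are only illustrative, not a guaranteed API):

```python
def encode_params(params):
    """Encode str values to UTF-8 bytes for an extension module expecting C strings."""
    return {k: v.encode("utf-8") if isinstance(v, str) else v
            for k, v in params.items()}

# Hypothetical parameter dict; only the string values need conversion.
params = encode_params({"surr_name": "sGaussianProcess", "n_iterations": 50})
```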

noise effect

Regarding noise, the documentation says: "Too much noise results in slow convergence while not enough noise might result in not converging at all."
Why is that?

compatibility with python3

According to the Python code it seems to be compatible with Python 3, but when I installed it with Python 3.5 and imported bayesopt, this error occurred:

ImportError: dynamic module does not define module export function(PyInit_bayesopt)

Any idea how to solve this? Many thanks.

Mixed type optimization (continuous, discrete, categorical)

Have there been any thoughts of implementing an optimization routine (with an interface to Python) that can optimize multiple types of parameters at once? This is often needed when optimizing machine learning algorithms. I'd really like to use it with deep learning, which has continuous, discrete and categorical hyperparameters.
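Until such a routine exists, a common workaround (independent of any particular library) is to let the optimizer search a plain continuous box and decode each point into mixed-type hyperparameters at evaluation time. A hypothetical sketch; all names and ranges here are made up for illustration:

```python
def decode(z):
    """Map z in [0,1]^3 to (learning rate, layer count, activation). Illustrative only."""
    lr = 10 ** (-4 + 3 * z[0])                  # continuous, log-uniform in [1e-4, 1e-1]
    layers = 1 + int(round(z[1] * 4))           # discrete, 1..5 by rounding
    acts = ["relu", "tanh", "sigmoid"]
    act = acts[min(int(z[2] * len(acts)), len(acts) - 1)]  # categorical by binning
    return lr, layers, act
```

The rounding and binning create plateaus that a smooth surrogate models imperfectly, which is why native mixed-type support is preferable; one-hot encoding each category into its own dimension is a common alternative.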

Access to the surrogate model through C API?

Hi

I like your package and wrote a little Julia wrapper.

One thing I couldn't easily figure out was how to access the surrogate model after fitting (my C++ is a bit limited). Is there an easy way (ideally through functions similar to the ones in the current C API) to access the surrogate model (e.g. inspect kernel parameters, sampling from the model or evaluating mean and sigma for some inputs)?

Use BayesOPT to optimize categorical variables

Hey Ruben,
Sorry to disturb you. I have a question about categorical variables. My inputs are 8 binary variables (0/1).
Here is the running status and the results.

bayesopt1.txt
log1.txt
bayesopt2.txt
log2.txt

When "mParameters.noise" is small, e.g. 1e-10, there is an error in log1.txt, but when "mParameters.noise" is equal to 1.0 there is no error. Why does this happen? This question has been bothering me for a long time; I would appreciate your help.

Thanks a lot.

Cui

Returning minimum of mean instead of minimum sample

Currently the optimization routine returns the x_i with smallest y_i observed. While this makes sense with deterministic functions, it doesn't make that much sense with stochastic functions where the minimum of the mean doesn't always coincide with the sample minimum. For this reason, it would be good to have an option for retrieving the minimum of the mean function instead of the minimum sampled value.
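The distinction is easy to demonstrate outside any library. In the generic Python sketch below, a single noisy draw per location can put the sample minimum at the wrong x, while averaging repeated draws approximates the mean function and recovers the true minimizer:

```python
import random
import statistics

random.seed(0)

def f(x):
    """Noisy objective whose mean is minimized at x = 2."""
    return (x - 2) ** 2 + random.gauss(0, 0.5)

xs = [0.0, 1.0, 2.0, 3.0]

# One noisy draw per location: the sample minimum can land on the wrong x.
sample_min_x = min(xs, key=lambda x: f(x))

# Averaging many repeated draws approximates the mean function.
mean_min_x = min(xs, key=lambda x: statistics.mean(f(x) for _ in range(200)))
```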

Segfault for low discrete parameter space

I cannot get the DiscreteModel to run without a segmentation fault (the continuous model works fine).

It might be connected to the discrete parameter space.
The error can be reproduced by changing the number of discrete points (line 80) in examples/bo_disc.cpp:

const size_t nPoints = 1000;  // large space - works fine
const size_t nPoints = 10;    // small space - segfaults

The first output of valgrind:

==64059== Memcheck, a memory error detector
==64059== Copyright (C) 2002-2015, and GNU GPL'd, by Julian Seward et al.
==64059== Using Valgrind-3.12.0 and LibVEX; rerun with -h for copyright info
==64059== Command: ./bin/bo_disc
==64059==
Running C++ interface
- 12:30:51.665538 INFO: Expected 6 hyperparameters. Replicating parameters and prior.
- 12:30:51.728427 INFO: Using default parameters for criteria.
==64059== Invalid read of size 8
==64059==    at 0x43D25A: bayesopt::DiscreteModel::generateInitialPoints(boost::numeric::ublas::matrix<double, boost::numeric::ublas::basic_row_major<unsigned long, long>, boost::numeric::ublas::unbounded_array<double, std::allocator<double> > >&) (in /home/bach_tm/git_repos_temporary/bayesopt/build/bin/bo_disc)
==64059==    by 0x442E43: bayesopt::BayesOptBase::initializeOptimization() (in /home/bach_tm/git_repos_temporary/bayesopt/build/bin/bo_disc)
==64059==    by 0x44383C: bayesopt::BayesOptBase::optimize(boost::numeric::ublas::vector<double, boost::numeric::ublas::unbounded_array<double, std::allocator<double> > >&) (in /home/bach_tm/git_repos_temporary/bayesopt/build/bin/bo_disc)
==64059==    by 0x43528C: main (in /home/bach_tm/git_repos_temporary/bayesopt/build/bin/bo_disc)
==64059==  Address 0x5a94bd8 is 8 bytes after a block of size 240 alloc'd
==64059==    at 0x4C2A6F0: operator new(unsigned long) (in /usr/lib64/valgrind/vgpreload_memcheck-amd64-linux.so)
==64059==    by 0x43D355: bayesopt::DiscreteModel::generateInitialPoints(boost::numeric::ublas::matrix<double, boost::numeric::ublas::basic_row_major<unsigned long, long>, boost::numeric::ublas::unbounded_array<double, std::allocator<double> > >&) (in /home/bach_tm/git_repos_temporary/bayesopt/build/bin/bo_disc)
==64059==    by 0x442E43: bayesopt::BayesOptBase::initializeOptimization() (in /home/bach_tm/git_repos_temporary/bayesopt/build/bin/bo_disc)
==64059==    by 0x44383C: bayesopt::BayesOptBase::optimize(boost::numeric::ublas::vector<double, boost::numeric::ublas::unbounded_array<double, std::allocator<double> > >&) (in /home/bach_tm/git_repos_temporary/bayesopt/build/bin/bo_disc)
==64059==    by 0x43528C: main (in /home/bach_tm/git_repos_temporary/bayesopt/build/bin/bo_disc)
==64059==

Compiler used: gcc / g++ 4.8.5

Why doesn't it converge to the right value?

Hi
Thanks for viewing the question.
I run the bayes optimization for my problem and get the data below.
The first column is the output y and the second column is x:
317.402,-0.278159//10 sample, 70 iteration
485.787,-0.325544
489.675,-0.256076
577.022,-0.244675
339.859,-0.310618
289.268,-0.282267
603.213,-0.241337
383.85,-0.314427
428.921,-0.265534
264.013,-0.296477// sample ends
296.115,-0.2817
297.359,-0.2817
296.15,-0.2817
295.788,-0.2817
296.233,-0.2817
296.172,-0.2817
295.955,-0.2817
295.969,-0.2817
295.711,-0.2817
295.864,-0.2817
296.037,-0.2817
296.08,-0.2817
.....

As you can see, the value of x converges to -0.2817; however, when x = -0.296477 the output is smaller (264). In fact, when I evaluated my function by stepping x in increments of 0.001 from -0.33 to 0.23, I found that y is smallest when x is about -0.2917.
In my problem the output is not deterministic because of an algorithm called RANSAC, but it stays near a certain value: as you can see, when x = -0.2817, y takes different values but does not change much. So what might be the problem? Why can't bayesopt find the minimum?
I use the default parameter.

Compatibility with python 3.6

I'm using Python 3.6.3 :: Anaconda custom (64-bit). I get the following error when I try running the examples:
File "demo_quad.py", line 22, in <module>
import bayesopt
ImportError: dynamic module does not define module export function (PyInit_bayesopt)

I have tried regenerating bayesopt.cpp with Cython and rebuilding and installing the entire code base with the new cpp, with no luck. I know this issue has cropped up before (#10 and #11), but I still face the same problem with Python 3.6.3.
Thanks,
Kumar

example: Build + Install on Google Colaboratory w/ Python 3.6

Hi Ruben,

Thank you for this very nice Bayesian Optimization library! It works very well, and has some well thought out defaults and features! :-)

I managed to get this running on Google Colaboratory in a Python 3.6 environment (after many wasted hours), and I just wanted to share how I did this. I have a collaborator who is stuck on Windows, and this looks like it could be a possible solution for us.

I'm not sure if this would be worth mentioning in your documentation as an option, but maybe another user would find this information helpful.

This is fairly ugly, so I'd be interested to hear if anyone has cleaner ways to get this up and running.

The following are the commands I used to install this within a Google Colaboratory notebook:

!apt install libboost-dev cmake cmake-curses-gui g++ python3-dev cython3 freeglut3-dev
!rm -rf /usr/include/numpy
!ln -s /usr/local/lib/python3.6/dist-packages/numpy/core/include/numpy /usr/include/numpy
!git clone https://github.com/rmcantin/bayesopt
%cd bayesopt/
!cmake -DBAYESOPT_PYTHON_INTERFACE=ON -DPYTHON_LIBRARY=/usr/lib/python3.6/config-3.6m-x86_64-linux-gnu/libpython3.6m.so -DPYTHON_INCLUDE_DIR=$(python-config --prefix)/include/python3.6 -DPYTHON_NUMPY_INCLUDE_DIR=/usr/lib/python3.6/dist-packages/numpy/core/include . && make && make install
!cp /usr/lib/python2.7/dist-packages/bayes* /usr/lib/python3.6/
%cd python/
%run demo_distance.py

And here is a sample notebook:
https://colab.research.google.com/drive/1ajWJGdrZCdfRML4O6Ltv2NpOE4w_oyFF

Very excited to optimize some functions now!

All the best,
CJ

Parallelization for matlab wrapper

Is there any way to parallelize the function evaluations for the matlab implementation of bayesoptcont()?

If not for the actual optimization (due to the sequential dependency), then at least for the initial function evaluations (set by n_init_samples)? Since I can run 20 of those initial samples simultaneously using HTCondor, it is pretty wasteful to do them serially.
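Assuming the wrapper indeed evaluates the initial design serially, one generic workaround is to generate and evaluate the initial samples yourself, in parallel, outside the library. The Python sketch below uses a thread pool as a stand-in for HTCondor; the objective and design are illustrative:

```python
import random
from concurrent.futures import ThreadPoolExecutor

def objective(x):
    """Stand-in for an expensive simulation run."""
    return sum((xi - 0.5) ** 2 for xi in x)

random.seed(1)
# Draw the whole initial design up front (uniform here; LHS in practice)...
design = [[random.random() for _ in range(2)] for _ in range(20)]
# ...then evaluate all 20 points concurrently instead of serially.
with ThreadPoolExecutor(max_workers=4) as pool:
    ys = list(pool.map(objective, design))
```

The evaluated pairs could then seed whatever warm-start mechanism your wrapper version provides, if any.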

About mSigma (variance) in gaussian_process.cpp

Dear Authors,

I am studying the Gaussian process used in the LCB (Lower Confidence Bound) criterion, taking bo_branin.cpp as an example. I found that the variance is consistently lower than 1, which can be much smaller than the mean even in the early training steps (for example, with a Gaussian model trained on only 2 samples, my own example gave a mean of around 170 but a variance of only 0.2, so the interval 170 +/- 0.2 cannot contain the true value).

While studying gaussian_process.cpp, I found that mSigma largely determines the value of the covariance. mSigma is set and read through setHyperParameters and getHyperParameters in kernelRegressor.hpp, and is updated through kOptimizer in posterior_empirical.cpp. An initial 2-dimensional point, the mean and the variance are combined into a 4-dimensional vector and passed to kOptimizer. However, I suspect that the mean and variance are not calculated correctly, as they are probably treated the same as the coordinates of the 2-dimensional points in kernelRegressor.hpp.

Could you help check the calculation of the variance? I am interested in studying this further.

Thanks,
Xianan

Using BayesOpt as a library

Hi,
I would like to ask how to integrate BayesOpt into my project.
I have compiled BayesOpt alone successfully, and the examples in bin/ work fine too.

I tried a minimal example as follows, where the include and lib directories are copied directly from the original BayesOpt build, and the main.cpp is copied from 'bo_cont.cpp'. When I compile and run it, it does not behave as it does inside BayesOpt. Are there any suggestions on how to integrate it into another project using only the libraries?

Thanks for your kind help!
test.zip

Multicore processing

Hi,

is it possible to exploit the presence of multiple processor cores with BayesOpt?

Regards,
Nik

Build error on Ubuntu 17.04

I cloned the repo, created a "build" directory inside, and ran "cmake .." from within that directory. Finally, I ran "make". The compilation goes well until this point:

[ 71%] Building CXX object CMakeFiles/bayesopt.dir/utils/fileparser.cpp.o
/home/vlad/tools/bayesopt/utils/fileparser.cpp: In member function ‘bool bayesopt::utils::FileParser::fileExists()’:
/home/vlad/tools/bayesopt/utils/fileparser.cpp:88:23: error: cannot convert ‘std::ifstream {aka std::basic_ifstream<char>}’ to ‘bool’ in initialization
   bool result = ifile;
                 ^~~~~
CMakeFiles/bayesopt.dir/build.make:734: recipe for target 'CMakeFiles/bayesopt.dir/utils/fileparser.cpp.o' failed
make[2]: *** [CMakeFiles/bayesopt.dir/utils/fileparser.cpp.o] Error 1
CMakeFiles/Makefile2:67: recipe for target 'CMakeFiles/bayesopt.dir/all' failed
make[1]: *** [CMakeFiles/bayesopt.dir/all] Error 2
Makefile:127: recipe for target 'all' failed
make: *** [all] Error 2

bayesopt.optimize_discrete

BayesOpt keeps trying the same point. Why does it do this when it already knows the outcome at that point?
After a few iterations it forces one random sample, but then it returns to the old point and gets stuck.

center of search space

When optimising multidimensional functions I find that the center of the upper and lower bounds is often sampled. It is the precise center every time, which makes me think there is a hard-coded probability of sampling the center somewhere, but I have not found where.

The reason I'm asking is because I would like to benchmark this package and many of the benchmark functions have their optimum near or at the center of the search space. These are the settings I've used for benchmark purposes

sc_type = SC_MAP
l_type = L_MCMC
init_method=1
n_inner_iterations=5000
n_iter_relearn=1
l_all=1

I have also tried force_jump = 0, but I still have the same problem of sampling the precise center. For example, optimising a 20-dimensional Rosenbrock function over the full space (-5.0, 5.0) in every dimension performs better than reducing the search space to (-5.0, 2.0), because of the center being sampled. Is there a way to turn this off to make the benchmark fairer?

Also if you have suggestions of settings (other than those I showed) for benchmark purposes I'd be happy to change them. The run time is not important, I'm interested in reducing the function value as much as possible per iteration.

function evaluation out of range

Hi:

I have encountered the problem 'ERROR: Function evaluation out of range', raised even before the loss function has finished. Do you know what could cause this? Many thanks.

Best

Utilise Boost to provide some sort of clue for unknown errors

Hi there,

a) Thanks for providing bayesopt -- I am having fun with it and my multidimensional optimisation problems!

b) Please forgive this somewhat specific, long-winded way of asking for the diff provided below to be merged, for what is undoubtedly technically a Matlab bug; I am providing a lot of detail in case it helps someone else. On the Matlab (R2023a) interface, I was trying to optimise a 37-dimensional function that returns essentially the least-squares difference between a simulation and experimental data. I had defined two functions: (i) AFullSimulation(x, parameters_1, parameters2, options, experimental_data), defined separately, and (ii) the anonymous function optimFun = @(x) AFullSimulation(x, Sys1, Exp, Options, ...), defined inline in the Matlab file I was trying to run. Calling bayesoptcont(@optimFun, ...) directly failed with "Error: -1" returned and "ERROR: Unknown error" in the log. I tried to understand this and tried variations such as bayesoptcont('optimFun'), but to no avail.

I therefore edited the wrapper function's generic, all-purpose catch(...) statement to report a specific error class when one is available, using a facility in Boost, which you are already linking against:

diff --git a/src/wrappers/bayesoptwpr.cpp b/src/wrappers/bayesoptwpr.cpp
index 2d38591..eeb13d3 100644
--- a/src/wrappers/bayesoptwpr.cpp
+++ b/src/wrappers/bayesoptwpr.cpp
@@ -26,6 +26,7 @@
 #include "log.hpp"
 #include "ublas_extra.hpp"
 #include "specialtypes.hpp"
+#include <boost/exception/diagnostic_information.hpp>

 static const int BAYESOPT_FAILURE = -1; /* generic failure code */
 static const int BAYESOPT_INVALID_ARGS = -2;
@@ -134,7 +135,8 @@ int bayes_optimization(int nDim, eval_func f, void* f_data,
     }
   catch (...)
     {
-      FILE_LOG(logERROR) << "Unknown error";
+      FILE_LOG(logERROR) << "Unknown error building continuous optimizer!" << std::endl;
+      FILE_LOG(logERROR) << boost::current_exception_diagnostic_information() << std::endl; /* Provide debug information if available */
       return BAYESOPT_FAILURE;
     }
   return 0; /* everything ok*/
@@ -192,7 +194,8 @@ int bayes_optimization_disc(int nDim, eval_func f, void* f_data,
     }
   catch (...)
     {
-      FILE_LOG(logERROR) << "Unknown error";
+      FILE_LOG(logERROR) << "Unknown error building discrete optimizer!" << std::endl;
+      FILE_LOG(logERROR) << boost::current_exception_diagnostic_information() << std::endl; /* Provide debugging info if it is there */
       return BAYESOPT_FAILURE;
     }

For my specific Matlab woe, this helpfully prints out the following:

[...] ERROR: Dynamic exception type: foundation::core::except::Exception<MATLAB::legacy_two_part::undefinedFunctionTextInputArgumentsType, std::exception, void> std::exception::what: Undefined function 'optimFun' for input arguments of type 'double'

Replacing @optimFun or 'optimFun' with @(x) AFullSimulation(x, ...) (i.e. its full definition written out) appears to make Matlab JIT it into something the MEX interface is happy with.

I'm happy to submit this as a pull request if you like but thought I'd reach out first (!).

Problem with LHS in Python interface

bayesopt, when using Latin hypercube sampling, seems to invoke intermediate callbacks, resulting in more initial samples than intended.

Example input values received in callback function:
x = [ 0.30424932 0. 0. ]
x = [ 0.30424932 0.32077927 0. ]
x = [ 0.30424932 0.32077927 0.43445221]
It seems like it is filling the list one by one and invoking the callback in between.
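For comparison, a correct Latin hypercube design fixes every coordinate of every point before any evaluation, so no callback should ever see a partially filled vector. A minimal generic sketch:

```python
import random

def latin_hypercube(n, dim, rng=None):
    """n points in [0,1)^dim with exactly one point per stratum in each dimension."""
    rng = rng or random.Random(0)
    cols = []
    for _ in range(dim):
        col = [(i + rng.random()) / n for i in range(n)]  # one draw per stratum
        rng.shuffle(col)                                  # independent per-dimension shuffle
        cols.append(col)
    return [tuple(col[i] for col in cols) for i in range(n)]

pts = latin_hypercube(5, 3)
```

Each dimension gets exactly one point in each stratum [i/n, (i+1)/n), shuffled independently of the other dimensions.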

Issue in MATLAB compilation

Hi
I have been trying to install this on the Windows platform in MATLAB 2018a. After building successfully using mingw32, matlab_compile.m shows the error:
MEX cannot find library 'bayesopt' specified with the -l option. MEX looks for a file with one of the names: libbayesopt.lib bayesopt.lib. Please specify the path to this library with the -L option.

The library files generated by the build are located in build/lib, but they have the extensions libbayesopt.a and libnlopt.a, while MATLAB is looking for .lib files. How can this be resolved?

Build Fails for BAYESOPT_BUILD_SHARED=ON

Hello,

First off, thank you for creating this library! I'm really excited to use it in my dissertation, but unfortunately I've been getting errors trying to build it. When I build it as static libraries (BAYESOPT_BUILD_SHARED=OFF), it works fine, but when I want shared libraries it doesn't.

I'm currently trying to use this as a distutils Python extension, where the BO runs in a C++ file that receives input from a Python file. To do this, I need shared libraries to link in the Extension() call, which unfortunately doesn't work. I've tried building and compiling with both MINGW32 and MSVC; both fail to link with the following errors:


C:/msys64/ucrt64/bin/../lib/gcc/x86_64-w64-mingw32/12.1.0/../../../../x86_64-w64-mingw32/bin/ld.exe: CMakeFiles\branin_system_calls.dir/objects.a(branin_system_calls.cpp.obj):branin_system_calls.cpp:(.text$_ZN17SystemCallsBranin14evaluateSampleERKN5boost7numeric5ublas6vectorIdNS2_15unbounded_arrayIdSaIdEEEEE[_ZN17SystemCallsBranin14evaluateSampleERKN5boost7numeric5ublas6vectorIdNS2_15unbounded_arrayIdSaIdEEEEE]+0xf1): undefined reference to `bayesopt::utils::FileParser::FileParser(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >, int)'
C:/msys64/ucrt64/bin/../lib/gcc/x86_64-w64-mingw32/12.1.0/../../../../x86_64-w64-mingw32/bin/ld.exe: CMakeFiles\branin_system_calls.dir/objects.a(branin_system_calls.cpp.obj):branin_system_calls.cpp:(.text$_ZN17SystemCallsBranin14evaluateSampleERKN5boost7numeric5ublas6vectorIdNS2_15unbounded_arrayIdSaIdEEEEE[_ZN17SystemCallsBranin14evaluateSampleERKN5boost7numeric5ublas6vectorIdNS2_15unbounded_arrayIdSaIdEEEEE]+0x78a): undefined reference to `bayesopt::utils::FileParser::openInput()'
C:/msys64/ucrt64/bin/../lib/gcc/x86_64-w64-mingw32/12.1.0/../../../../x86_64-w64-mingw32/bin/ld.exe: CMakeFiles\branin_system_calls.dir/objects.a(branin_system_calls.cpp.obj):branin_system_calls.cpp:(.text$_ZN17SystemCallsBranin14evaluateSampleERKN5boost7numeric5ublas6vectorIdNS2_15unbounded_arrayIdSaIdEEEEE[_ZN17SystemCallsBranin14evaluateSampleERKN5boost7numeric5ublas6vectorIdNS2_15unbounded_arrayIdSaIdEEEEE]+0x7e3): undefined reference to `bayesopt::utils::FileParser::read(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >)'
C:/msys64/ucrt64/bin/../lib/gcc/x86_64-w64-mingw32/12.1.0/../../../../x86_64-w64-mingw32/bin/ld.exe: CMakeFiles\branin_system_calls.dir/objects.a(branin_system_calls.cpp.obj):branin_system_calls.cpp:(.text$_ZN17SystemCallsBranin14evaluateSampleERKN5boost7numeric5ublas6vectorIdNS2_15unbounded_arrayIdSaIdEEEEE[_ZN17SystemCallsBranin14evaluateSampleERKN5boost7numeric5ublas6vectorIdNS2_15unbounded_arrayIdSaIdEEEEE]+0xb1d): undefined reference to `bayesopt::utils::FileParser::close()'
C:/msys64/ucrt64/bin/../lib/gcc/x86_64-w64-mingw32/12.1.0/../../../../x86_64-w64-mingw32/bin/ld.exe: CMakeFiles\branin_system_calls.dir/objects.a(branin_system_calls.cpp.obj):branin_system_calls.cpp:(.text$_ZN17SystemCallsBranin14evaluateSampleERKN5boost7numeric5ublas6vectorIdNS2_15unbounded_arrayIdSaIdEEEEE[_ZN17SystemCallsBranin14evaluateSampleERKN5boost7numeric5ublas6vectorIdNS2_15unbounded_arrayIdSaIdEEEEE]+0xb46): undefined reference to `bayesopt::utils::FileParser::~FileParser()'
C:/msys64/ucrt64/bin/../lib/gcc/x86_64-w64-mingw32/12.1.0/../../../../x86_64-w64-mingw32/bin/ld.exe: CMakeFiles\branin_system_calls.dir/objects.a(branin_system_calls.cpp.obj):branin_system_calls.cpp:(.text$_ZN17SystemCallsBranin14evaluateSampleERKN5boost7numeric5ublas6vectorIdNS2_15unbounded_arrayIdSaIdEEEEE[_ZN17SystemCallsBranin14evaluateSampleERKN5boost7numeric5ublas6vectorIdNS2_15unbounded_arrayIdSaIdEEEEE]+0xef5): undefined reference to `bayesopt::utils::FileParser::~FileParser()'
C:/msys64/ucrt64/bin/../lib/gcc/x86_64-w64-mingw32/12.1.0/../../../../x86_64-w64-mingw32/bin/ld.exe: CMakeFiles\branin_system_calls.dir/objects.a(branin_system_calls.cpp.obj):branin_system_calls.cpp:(.text.startup+0x2b): undefined reference to `bayesopt::Parameters::Parameters()'
C:/msys64/ucrt64/bin/../lib/gcc/x86_64-w64-mingw32/12.1.0/../../../../x86_64-w64-mingw32/bin/ld.exe: CMakeFiles\branin_system_calls.dir/objects.a(branin_system_calls.cpp.obj):branin_system_calls.cpp:(.text.startup+0xb4): undefined reference to `bayesopt::utils::ParamLoader::load(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >, bayesopt::Parameters&)'
C:/msys64/ucrt64/bin/../lib/gcc/x86_64-w64-mingw32/12.1.0/../../../../x86_64-w64-mingw32/bin/ld.exe: CMakeFiles\branin_system_calls.dir/objects.a(branin_system_calls.cpp.obj):branin_system_calls.cpp:(.text.startup+0x498): undefined reference to `bayesopt::Parameters::Parameters(bopt_params)'
collect2.exe: error: ld returned 1 exit status
mingw32-make[2]: *** [examples\CMakeFiles\branin_system_calls.dir\build.make:101: bin/branin_system_calls.exe] Error 1
mingw32-make[1]: *** [CMakeFiles\Makefile2:190: examples/CMakeFiles/branin_system_calls.dir/all] Error 2
mingw32-make: *** [Makefile:135: all] Error 2

I'm not sure what the error is or how to fix it, any help is appreciated!

Thanks,
Hana

MultiObjective optimization

Hello all,
Can the bayesopt library also deal with multi-objective optimization (i.e. multi-output black-box functions)?
