Comments (7)
So if you have defined `x_guesses` in the problem, use `pypesto.optimize.ScipyDifferentialEvolutionOptimizer`, and call `pypesto.optimize.minimize`, the following will happen:
- In pypesto/optimize/optimize.py, lines 89-93: if you have not defined a startpoint method, it will use `uniform` as the startpoint method, which is of the class `FunctionStartpoints`, a subclass of `CheckedStartpoints`.
- In lines 105-109 of optimize.py, the startpoints are assigned; the call takes the problem as an argument.
- This call, given you use default options, will return `x_guesses`, which will then be supplied to the optimizer as the `x0` argument. I think that is the relevant argument line. I hope this helps in clarifying it :)
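As a rough sketch of the behavior described above (schematic only; `assign_startpoints` is a made-up name here, not pypesto's actual API), the default uniform startpoint logic amounts to: take the user-supplied `x_guesses` first, then fill the remaining starts with uniform samples within the bounds:

```python
import numpy as np

def assign_startpoints(n_starts, x_guesses, lb, ub, seed=0):
    """Schematic of the default (uniform) startpoint assignment:
    user-supplied guesses come first, and the remaining starts are
    sampled uniformly within the box [lb, ub]."""
    rng = np.random.default_rng(seed)
    lb, ub = np.asarray(lb, dtype=float), np.asarray(ub, dtype=float)
    guesses = np.asarray(x_guesses, dtype=float).reshape(-1, lb.size)
    n_guesses = min(len(guesses), n_starts)
    # fill the remaining (n_starts - n_guesses) rows with uniform samples
    random_points = rng.uniform(lb, ub, size=(n_starts - n_guesses, lb.size))
    return np.vstack([guesses[:n_guesses], random_points])

startpoints = assign_startpoints(
    n_starts=5, x_guesses=[[0.5, 0.5]], lb=[0.0, 0.0], ub=[1.0, 1.0]
)
```

Each row of the returned array then becomes one `x0` for one optimizer start.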
from pypesto.
Though I might have to correct myself, I think it is not passed on, as it is lost in the minimise function. Thanks a lot for the question, will fix that and also clarify in the documentation.
from pypesto.
will also have a look at the PySwarms optimizer 👍🏼
from pypesto.
Many thanks 🙏🏼 A fix for the `ScipyDifferentialEvolutionOptimizer`, as well as at least a warning for the PySwarms optimizer, will be released within this week :)
from pypesto.
Merged at least a warning for the optimizers in #1027.
from pypesto.
This does help, but I have been doing the same thing, walking through code to figure out how potentially conflicting options are handled. For instance, I thought that it would send the `x_guesses` in the `PyswarmsOptimizer` to act as `init_pos` for the particles in the swarm. But instead, `x0` is first set by the startpoint method:

pyPESTO/pypesto/optimize/optimize.py, lines 133 to 141 at 13ce29b

Then `x0` is used only to define the dimension, not `init_pos`:

pyPESTO/pypesto/optimize/optimizer.py, lines 886 to 891 at 13ce29b
I think some documentation describing options could be useful, but if it's too much work we can always figure things out by looking at src.
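To illustrate the flow described above with a schematic (a made-up helper, not pypesto's or pyswarms' actual code), `x0` contributes only its length, i.e. the search-space dimension, while the initial particle positions are drawn at random, so the values in `x0` never reach the swarm:

```python
import numpy as np

def pyswarms_setup_sketch(x0, n_particles, lb, ub, seed=0):
    """Schematic only: x0 is used to derive the dimension of the
    search space; the initial particle positions are random, so the
    values in x0 do not influence where the swarm starts."""
    rng = np.random.default_rng(seed)
    dimensions = len(x0)  # the only use of x0
    init_pos = rng.uniform(lb, ub, size=(n_particles, dimensions))
    return dimensions, init_pos

dimensions, init_pos = pyswarms_setup_sketch(
    x0=[0.5, 0.5, 0.5], n_particles=4, lb=0.0, ub=1.0
)
```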
from pypesto.
@PaulJonasJost I've taken some notes on my exploration. They might be useful to you. Excuse them for their brevity. I think the last example showing `generate_swarm` will be informative.
- Say we set `pypesto.problem.x_guesses` (an $n \times d$ array). Then the following loop will occur $n$ times, with random starts, when using the PySwarms optimizer. This works well, as we will have particles starting in new locations.
```python
for startpoint, id in zip(startpoints, ids):
    task = OptimizerTask(
        optimizer=optimizer,
        problem=problem,
        x0=startpoint,
        id=id,
        history_options=history_options,
        optimize_options=options,
    )
    tasks.append(task)
```
- If the PySwarms option `init_pos=x_guesses` is set, there will still be $n$ loops, but the number of particles `par_popsize` will be ignored and will instead be `len(x_guesses)`, and the initialization will be the same for each new set of particles. This is not desirable for properly running the optimizer. Because of this, it is probably best to advise against using `init_pos`.
- Side note: `x0` is evaluated for both `init_pos=x_guesses` and the default `init_pos=None`.
- I've tested how particles are generated to show this. If we specify `init_pos`, `n_particles` is overwritten and we only get an initialization of the size of `init_pos`.

```python
>>> import numpy as np
>>> from pyswarms.backend.generators import generate_swarm
>>> init_pos = np.ones((2, 3))
>>> generate_swarm(n_particles=4, dimensions=init_pos.shape[1], init_pos=init_pos)
array([[1., 1., 1.],
       [1., 1., 1.]])
```
The `dimensions` arg needs to be specified, but is overridden by the `init_pos` kwarg.

```python
>>> generate_swarm(n_particles=4, dimensions=5, init_pos=init_pos)
array([[1., 1., 1.],
       [1., 1., 1.]])
```
`init_pos=None` is the normal default behavior, with random initialization of particles. Doing this, we actually get 4 particles.

```python
>>> generate_swarm(n_particles=4, dimensions=init_pos.shape[1], init_pos=None)
array([[0.17696656, 0.03424968, 0.55395097],
       [0.85859682, 0.5250245 , 0.41581642],
       [0.64526319, 0.04922589, 0.31136249],
       [0.53644966, 0.48004648, 0.32024561]])
```
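Given the behavior above, one possible workaround (a hypothetical helper, not part of pypesto or pyswarms) would be to pad the guesses with random points up to the desired population size before handing them over as `init_pos`, so the swarm keeps its full `n_particles`:

```python
import numpy as np

def pad_init_pos(x_guesses, n_particles, lb, ub, seed=0):
    """Hypothetical helper: pad a (k, d) array of guesses with
    uniform samples up to n_particles rows, so that passing the
    result as init_pos does not shrink the swarm."""
    rng = np.random.default_rng(seed)
    guesses = np.asarray(x_guesses, dtype=float)
    k, d = guesses.shape
    if k >= n_particles:
        return guesses[:n_particles]
    extra = rng.uniform(lb, ub, size=(n_particles - k, d))
    return np.vstack([guesses, extra])

init_pos = pad_init_pos(np.ones((2, 3)), n_particles=4, lb=0.0, ub=1.0)
```

The first `k` rows keep the original guesses; the rest are uniform random draws within the bounds.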
from pypesto.