Comments (20)
Do you have a reproducible example? How are you calling Ipopt?
from ipopt.jl.
The code that produces the error comes from a longer procedure. It involves some data processing, running a few models to get approximate solutions / warmstarts for the lower-level optimization problem, and then all the iterations of the iterative solution procedure. The bilevel code is in principle similar to the following PR: joaquimg/BilevelJuMP.jl#184 (see https://github.com/joaquimg/BilevelJuMP.jl/blob/b0160b788bbe0e22dd7ce21b113cbb596f79e06d/docs/src/examples/Iterative_example1.jl for a simple example). Ipopt is also called like this (through MOI), and the first iterations (sometimes all of them) run through perfectly fine.
I could share the full error-producing repo if that helps, but setting up an MWE is not possible because the error is highly input-specific. For example, if I specify different iter_eps values (basically a different RHS for the complementary slackness conditions, but still in the same order of magnitude), the whole procedure runs through without any errors.
Getting to the error will take a few hours/days of computation on a cluster computer though, which makes the situation even more impractical...
I could try to produce a full log file, but this will surely be a mess :D
It's going to be hard, if not impossible, to debug this without a reliable reproducible example.
Is there a good way to create a single reproducible model run? Maybe something like writing the `MathOptInterface.Bridges.LazyBridgeOptimizer{Ipopt.Optimizer}` to disk for every iteration before calling `MOI.optimize!()`. Following a run on the cluster computer, I could share the iteration that produces the error. I was thinking of something like serialization or JLD, but I'm not sure whether that could work?
In theory, `MOI.write_to_file(model, "model.nl")`. But there could be any number of reasons why that wouldn't work, mainly because the model we create by reading the file is not bit-for-bit identical to the one you have.
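To make the suggestion concrete, here is a minimal sketch of per-iteration dumping with `MOI.write_to_file`. The toy model, loop, and filenames are purely illustrative and not from the actual bilevel code:

```julia
using MathOptInterface
const MOI = MathOptInterface

# Illustrative toy model: min x  s.t.  x >= 1, built in a plain cache
# model so file export works without querying the solver.
model = MOI.Utilities.Model{Float64}()
x = MOI.add_variable(model)
MOI.add_constraint(model, x, MOI.GreaterThan(1.0))
MOI.set(model, MOI.ObjectiveSense(), MOI.MIN_SENSE)
MOI.set(model, MOI.ObjectiveFunction{MOI.VariableIndex}(), x)

for iter in 1:3
    # Dump the model before each (hypothetical) solve, so the iteration
    # that eventually errors can be shared afterwards.
    MOI.write_to_file(model, "model_iter_$(iter).nl")
end
```

The format is inferred from the `.nl` extension, so the same sketch works for other file formats by changing the extension.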
Does it happen with a solver other than MA97?
I think .nl files reorder variables (see https://www.ampl.com/wp-content/uploads/Hooking-Your-Solver-to-AMPL-by-David-M.-Gay.pdf). Since the problem seems related to some numerical situation, this will likely make a difference (?). Nevertheless, I will try...
It seems like `MOI.copy_to()` does not work for `MathOptInterface.Bridges.LazyBridgeOptimizer{Ipopt.Optimizer}` because:

```
ERROR: MathOptInterface.GetAttributeNotAllowed{MathOptInterface.ListOfModelAttributesSet}: Getting attribute MathOptInterface.ListOfModelAttributesSet() cannot be performed: Ipopt.Optimizer does not support getting the attribute MathOptInterface.ListOfModelAttributesSet(). You may want to use a `CachingOptimizer` in `AUTOMATIC` mode or you may need to call `reset_optimizer` before doing this operation if the `CachingOptimizer` is in `MANUAL` mode.
```

Is this intended (and is there another way except copying to `MOI.FileFormats.Model()`)?
I have figured out a way around this, but I'm not 100% sure it won't mix things up a bit...
I did also send out another run with MA86. Usually different linear solvers produce different solutions/trajectories, so if it comes back without problems we cannot infer the error is related to MA97. But if we're lucky it will also produce an error. Then we could at least exclude the linear solvers from the list of likely candidates (they could still both have a similar issue, but this seems less likely...). Unfortunately, we will have to wait a bit for the results...
Ah. You probably need to use `MOI.instantiate(Ipopt.Optimizer; with_bridge_type=Float64)` as the solver so that it is built with a cache.
In my case, this is how `MathOptInterface.Bridges.LazyBridgeOptimizer{Ipopt.Optimizer}` is instantiated (see https://github.com/joaquimg/BilevelJuMP.jl/blob/565c0ef6d5fd07ae7ff558bbf2466b87e815caf9/src/jump.jl#L829-L854). Do you have another idea?

My way of working around this was to copy a `MathOptInterface.Utilities.CachingOptimizer{MathOptInterface.AbstractOptimizer, MathOptInterface.Utilities.UniversalFallback{MathOptInterface.Utilities.Model{Float64}}}` to a `MathOptInterface.Bridges.LazyBridgeOptimizer{Ipopt.Optimizer}` in every iteration instead of directly reusing it (which arguably might make some difference...).
I can then write the `CachingOptimizer` to a file.
I don't understand. Where did `MathOptInterface.Bridges.LazyBridgeOptimizer{Ipopt.Optimizer}` come from? Are you using BilevelJuMP or just MOI and Ipopt?

If you're using MOI, then use `model = MOI.instantiate(Ipopt.Optimizer; with_bridge_type=Float64)` as your model.
Sorry, that was a bit confusing. I'm using an extended version of BilevelJuMP (that was the PR I linked here: #341 (comment)). There, the solver is instantiated like this: `optimizer = MOI.instantiate(optimizer_constructor; with_bridge_type = Float64)`, and trying to copy it does not work.
The following MWE also produces the same error on my machine:
```julia
using MathOptInterface
using Ipopt
const MOI = MathOptInterface

optimizer = MOI.instantiate(Ipopt.Optimizer; with_bridge_type = Float64)
dest = MOI.FileFormats.Model(; filename = joinpath(pwd(), "tst_logs", "tst.nl"))
MOI.copy_to(dest, optimizer)
MOI.write_to_file(dest, joinpath(pwd(), "tst_logs", "_model.nl"))
```
The error message is:

```
ERROR: MathOptInterface.GetAttributeNotAllowed{MathOptInterface.ListOfModelAttributesSet}: Getting attribute MathOptInterface.ListOfModelAttributesSet() cannot be performed: Ipopt.Optimizer does not support getting the attribute MathOptInterface.ListOfModelAttributesSet(). You may want to use a `CachingOptimizer` in `AUTOMATIC` mode or you may need to call `reset_optimizer` before doing this operation if the `CachingOptimizer` is in `MANUAL` mode.
Stacktrace:
 [1] get_fallback(model::Ipopt.Optimizer, attr::MathOptInterface.ListOfModelAttributesSet)
   @ MathOptInterface ~/.julia/packages/MathOptInterface/Ht8hE/src/attributes.jl:406
 [2] get(::Ipopt.Optimizer, ::MathOptInterface.ListOfModelAttributesSet)
   @ MathOptInterface ~/.julia/packages/MathOptInterface/Ht8hE/src/attributes.jl:390
 [3] get(b::MathOptInterface.Bridges.LazyBridgeOptimizer{Ipopt.Optimizer}, attr::MathOptInterface.ListOfModelAttributesSet)
   @ MathOptInterface.Bridges ~/.julia/packages/MathOptInterface/Ht8hE/src/Bridges/bridge_optimizer.jl:790
 [4] copy_to(dest::MathOptInterface.FileFormats.NL.Model, model::MathOptInterface.Bridges.LazyBridgeOptimizer{Ipopt.Optimizer})
   @ MathOptInterface.FileFormats.NL ~/.julia/packages/MathOptInterface/Ht8hE/src/FileFormats/NL/NL.jl:260
 [5] top-level scope
   @ Untitled-1:5
```
with:

- MathOptInterface v1.10.0
- Ipopt v1.1.0
Edit: forgot the const MOI line...
@odow Should the code above work, or am I approaching this from the wrong side?
Try:
```julia
optimizer = MOI.Utilities.CachingOptimizer(
    MOI.Utilities.UniversalFallback(MOI.Utilities.Model{Float64}()),
    MOI.instantiate(Ipopt.Optimizer; with_bridge_type = Float64),
)
```
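With that cache-backed wrapper, the attribute queries in `copy_to` are answered by the cache instead of `Ipopt.Optimizer`, so the file export from the MWE should go through. A sketch (toy model content and paths are illustrative):

```julia
using MathOptInterface
using Ipopt
const MOI = MathOptInterface

optimizer = MOI.Utilities.CachingOptimizer(
    MOI.Utilities.UniversalFallback(MOI.Utilities.Model{Float64}()),
    MOI.instantiate(Ipopt.Optimizer; with_bridge_type = Float64),
)

# Build the model through `optimizer` as usual (toy content here):
x = MOI.add_variable(optimizer)
MOI.add_constraint(optimizer, x, MOI.GreaterThan(1.0))
MOI.set(optimizer, MOI.ObjectiveSense(), MOI.MIN_SENSE)
MOI.set(optimizer, MOI.ObjectiveFunction{MOI.VariableIndex}(), x)

# The export now reads everything from the cache:
dest = MOI.FileFormats.Model(; format = MOI.FileFormats.FORMAT_NL)
MOI.copy_to(dest, optimizer)
MOI.write_to_file(dest, "model.nl")
```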
Ok, so this is probably the best I can do...
> My way of working around this was to copy a `MathOptInterface.Utilities.CachingOptimizer{MathOptInterface.AbstractOptimizer, MathOptInterface.Utilities.UniversalFallback{MathOptInterface.Utilities.Model{Float64}}}` to a `MathOptInterface.Bridges.LazyBridgeOptimizer{Ipopt.Optimizer}` in every iteration instead of directly reusing it (which arguably might make some difference...).
Will set this up and hopefully get back with a reproducible example...
Ok, I did a bit of testing on this, and there might be a problem with MOI and .nl files.
The attached script works fine for primal variable starts, but ignores dual starts. I also took a look at the MOI code for .nl files and could not find anything about `ConstraintDualStart()` there. Did I miss something?
```julia
using MathOptInterface
using Ipopt
const MOI = MathOptInterface

src = MOI.FileFormats.Model(format = MOI.FileFormats.FORMAT_NL)
MOI.read_from_file(src, joinpath(pwd(), "_model_storage", "model.nl"))
solver = MOI.instantiate(Ipopt.Optimizer; with_bridge_type = Float64)
MOI.copy_to(solver, src)
MOI.set(solver, MOI.RawOptimizerAttribute("warm_start_init_point"), "yes")
MOI.set(solver, MOI.RawOptimizerAttribute("warm_start_bound_push"), 1e-12)
MOI.set(solver, MOI.RawOptimizerAttribute("warm_start_bound_frac"), 1e-12)
MOI.set(solver, MOI.RawOptimizerAttribute("warm_start_slack_bound_frac"), 1e-12)
MOI.set(solver, MOI.RawOptimizerAttribute("warm_start_slack_bound_push"), 1e-12)
MOI.set(solver, MOI.RawOptimizerAttribute("warm_start_mult_bound_push"), 1e-12)
MOI.set(solver, MOI.RawOptimizerAttribute("mu_init"), 1e-12)
MOI.set(solver, MOI.RawOptimizerAttribute("print_level"), 5)
MOI.optimize!(solver)
```
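Until the .nl round-trip preserves dual starts, one possible stopgap is to carry the duals separately and re-apply them after the copy. This is only a sketch: `saved_duals` is a hypothetical container mapping the file model's constraint indices to dual values, and `index_map` is the index map returned by `copy_to`:

```julia
# Hypothetical workaround: re-apply dual starts by hand after copying
# the file model into the solver, since the .nl file itself drops them.
index_map = MOI.copy_to(solver, src)
for (ci, dual) in saved_duals  # hypothetical Dict of constraint index => dual value
    MOI.set(solver, MOI.ConstraintDualStart(), index_map[ci], dual)
end
```

Whether the constraint indices still line up after a file round-trip (given the variable reordering mentioned above) is exactly the open question here.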
> I did also take a look at the MOI code for .nl files and could not find anything about `ConstraintDualStart()` there
I don't think we support dual starts in the NL files yet.
I can try to write this up the next few days.
> I can try to write this up the next few days.
Think I was a bit optimistic here...
.nl files are pretty messy to me, and dual starts even more so. For example, I have no clue how to handle things like duals on variable bounds...
But instead, I managed to write an extension of the MOF format that correctly stores primal and dual starts.
On smaller test cases, the Ipopt runs appear to be reproducible...
If leaving out starts was not a design choice for MOF, I could also do a PR with the amendments to MOI.
> Think I was a bit optimistic here... .nl files are pretty messy to me and dual starts even more so.
😆 I'm not surprised. NL files are pretty cryptic!
> If leaving out starts was not a design choice for MOF, I could also do a PR with the amendments to MOI.
Not a design choice. Just something I didn't get around to. Please open a PR.
We'll also have to make changes to the schema: https://github.com/jump-dev/MathOptFormat
The place to add it is somewhere around:
https://github.com/jump-dev/MathOptFormat/blob/67e65785623330af60f7bbf2eab7f48d4580f322/schemas/mof.1.1.schema.json#L87-L107
but if you open a PR with your suggestion in MOI, I can show you how to change the schema
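For illustration only, the kind of schema addition under discussion might store start values alongside each variable and constraint entry. The field names `primal_start` and `dual_start` are my guesses here, not the actual schema:

```json
{
  "variables": [
    {"name": "x", "primal_start": 1.0}
  ],
  "constraints": [
    {
      "name": "c1",
      "function": {"type": "Variable", "name": "x"},
      "set": {"type": "GreaterThan", "lower": 1.0},
      "dual_start": -0.5
    }
  ]
}
```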
Closing this; I believe it is related to dlopen() when handling linear solvers.
I recently also got a segfault when using MA97 that was actually caused by the Pardiso shared library...