
ADNLPModels


This package provides automatic differentiation (AD)-based model implementations that conform to the NLPModels API. The general form of the optimization problem is

$$\begin{aligned} \min \quad & f(x) \\ & c_L \leq c(x) \leq c_U \\ & \ell \leq x \leq u, \end{aligned}$$

How to Cite

If you use ADNLPModels.jl in your work, please cite using the format given in CITATION.bib.

Installation

ADNLPModels is a Julia Language package. To install it, open Julia's interactive session (the REPL), press the `]` key to enter package mode, and type the following command:

pkg> add ADNLPModels

Examples

For optimization in the general form, this package exports two constructors ADNLPModel and ADNLPModel!.

using ADNLPModels

f(x) = 100 * (x[2] - x[1]^2)^2 + (x[1] - 1)^2
T = Float64
x0 = T[-1.2; 1.0]
# Rosenbrock
nlp = ADNLPModel(f, x0) # unconstrained

lvar, uvar = zeros(T, 2), ones(T, 2) # must be of the same type as `x0`
nlp = ADNLPModel(f, x0, lvar, uvar) # bound-constrained

c(x) = [x[1] + x[2]]
lcon, ucon = -T[0.5], T[0.5]
nlp = ADNLPModel(f, x0, lvar, uvar, c, lcon, ucon) # constrained

c!(cx, x) = begin
  cx[1] = x[1] + x[2]
  return cx
end
nlp = ADNLPModel!(f, x0, lvar, uvar, c!, lcon, ucon) # in-place constrained

It is possible to distinguish between linear and nonlinear constraints; see the documentation for details.

This package also exports the constructors ADNLSModel and ADNLSModel! for Nonlinear Least Squares (NLS), i.e. when the objective function is a sum of squared terms.
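Concretely, for a residual function $F$, the NLS model minimizes the standard least-squares objective (the $\tfrac{1}{2}$ factor follows the usual NLPModels convention):

$$\min_x \quad \tfrac{1}{2} \|F(x)\|_2^2.$$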

using ADNLPModels

F(x) = [10 * (x[2] - x[1]^2); x[1] - 1]
nequ = 2 # length of Fx
T = Float64
x0 = T[-1.2; 1.0]
# Rosenbrock in NLS format
nlp = ADNLSModel(F, x0, nequ)

The resulting models, ADNLPModel and ADNLSModel, are instances of AbstractNLPModel and implement the NLPModel API, see NLPModels.jl.
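As a brief sketch (assuming ADNLPModels.jl and NLPModels.jl are installed), the generic NLPModels functions apply directly to these models:

```julia
using ADNLPModels, NLPModels

f(x) = 100 * (x[2] - x[1]^2)^2 + (x[1] - 1)^2
nlp = ADNLPModel(f, [-1.2; 1.0])

fx = obj(nlp, nlp.meta.x0)   # objective value at the starting point
gx = grad(nlp, nlp.meta.x0)  # gradient computed by the AD backend
```

Any solver written against the NLPModels API can consume the resulting model unchanged.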

We refer to the documentation for more details on the resulting models. Tutorials are available at jso.dev/tutorials/ under the ADNLPModels.jl tag.

AD backend

The following AD packages are supported:

  • ForwardDiff.jl;
  • ReverseDiff.jl;

and as optional dependencies (you must load the corresponding package first):

  • Enzyme.jl;
  • SparseDiffTools.jl;
  • Symbolics.jl;
  • Zygote.jl.
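As a sketch of selecting a non-default backend (hedged: the `gradient_backend` keyword and the `ReverseDiffADGradient` type follow the naming scheme used in this package's tests; check the documentation for the exact names in your version):

```julia
using ADNLPModels, ReverseDiff

f(x) = sum(x .^ 2)
# Replace only the gradient backend; other operations keep their defaults.
nlp = ADNLPModel(f, ones(3); gradient_backend = ADNLPModels.ReverseDiffADGradient)
```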

Bug reports and discussions

If you think you found a bug, feel free to open an issue. Focused suggestions and requests can also be opened as issues. Before opening a pull request, please start an issue or a discussion on the topic.

If you want to ask a question not suited for a bug report, feel free to start a discussion here. This forum is for general discussion about this repository and the JuliaSmoothOptimizers organization, so questions about any of our packages are welcome.

Contributors

abelsiqueira, amontoison, dpo, github-actions[bot], jsobot, juliatagbot, mohamed82008, monssaftoukal, probot-auto-merge[bot], staticfloat, tmigot, vepiteski


Issues

Support more of Enzyme

Hey there!

I've been using your packages to work on an inverse design problem for a paper (and my thesis) with great success. I'm now trying to squeeze more performance out of the solver, and I've read that Enzyme can be more performant than the default ForwardDiff. Enzyme supports both forward and reverse mode, as well as Jacobians and Hessians (via forward over reverse). Currently, ADNLPModels only supports Enzyme's reverse mode. It would be useful to support all the features Enzyme has to offer as a higher-performance alternative to the currently defined AD backends.

Warnings about unused type parameters

https://github.com/JuliaSmoothOptimizers/ADNLPModels.jl/actions/runs/5090752814/jobs/9149961129#step:6:385

┌ ADNLPModels [54578032-b7ea-4c30-94aa-7cbd1cce6c9a]
│  WARNING: method definition for #ADNLSModel!#299 at /home/runner/work/ADNLPModels.jl/ADNLPModels.jl/src/nls.jl:149 declares type variable AD but does not use it.
│  WARNING: method definition for #ADNLSModel#335 at /home/runner/work/ADNLPModels.jl/ADNLPModels.jl/src/nls.jl:570 declares type variable AD but does not use it.
│  WARNING: method definition for #ADNLSModel!#338 at /home/runner/work/ADNLPModels.jl/ADNLPModels.jl/src/nls.jl:600 declares type variable AD but does not use it.
└  

Unit tests segfault on M1 Mac

signal (11): Segmentation fault: 11
in expression starting at /Users/dpo/.julia/packages/ADNLPModels/0WWOF/test/nlp/nlpmodelstest.jl:2
ntuple at ./ntuple.jl:0
unknown function (ip: 0x168a037b3)
jl_apply_generic at /Applications/Julia-1.7.app/Contents/Resources/julia/lib/julia/libjulia-internal.1.7.dylib (unknown line)
getindex at ./range.jl:373
hvcat at /Users/sabae/src/julia/usr/share/julia/stdlib/v1.7/SparseArrays/src/sparsevector.jl:1110
jl_apply_generic at /Applications/Julia-1.7.app/Contents/Resources/julia/lib/julia/libjulia-internal.1.7.dylib (unknown line)
#hess#28 at /Users/dpo/.julia/packages/NLPModelsModifiers/CLqxQ/src/slack-model.jl:329
hess at /Users/dpo/.julia/packages/NLPModelsModifiers/CLqxQ/src/slack-model.jl:325
jl_apply_generic at /Applications/Julia-1.7.app/Contents/Resources/julia/lib/julia/libjulia-internal.1.7.dylib (unknown line)
#consistent_functions#117 at /Users/dpo/.julia/packages/NLPModelsTest/iQzt6/src/nlp/consistency.jl:174
consistent_functions##kw at /Users/dpo/.julia/packages/NLPModelsTest/iQzt6/src/nlp/consistency.jl:92
unknown function (ip: 0x1689f3657)
jl_apply_generic at /Applications/Julia-1.7.app/Contents/Resources/julia/lib/julia/libjulia-internal.1.7.dylib (unknown line)
#consistent_nlps#107 at /Users/dpo/.julia/packages/NLPModelsTest/iQzt6/src/nlp/consistency.jl:57
consistent_nlps##kw at /Users/dpo/.julia/packages/NLPModelsTest/iQzt6/src/nlp/consistency.jl:24
unknown function (ip: 0x1054a15b3)
jl_apply_generic at /Applications/Julia-1.7.app/Contents/Resources/julia/lib/julia/libjulia-internal.1.7.dylib (unknown line)
macro expansion at /Users/dpo/.julia/packages/ADNLPModels/0WWOF/test/nlp/nlpmodelstest.jl:13 [inlined]
macro expansion at /Users/sabae/src/julia/usr/share/julia/stdlib/v1.7/Test/src/Test.jl:1283 [inlined]
macro expansion at /Users/dpo/.julia/packages/ADNLPModels/0WWOF/test/nlp/nlpmodelstest.jl:13 [inlined]
macro expansion at /Users/sabae/src/julia/usr/share/julia/stdlib/v1.7/Test/src/Test.jl:1283 [inlined]
macro expansion at /Users/dpo/.julia/packages/ADNLPModels/0WWOF/test/nlp/nlpmodelstest.jl:5 [inlined]
top-level scope at /Users/sabae/src/julia/usr/share/julia/stdlib/v1.7/Test/src/Test.jl:1359
jl_toplevel_eval_flex at /Applications/Julia-1.7.app/Contents/Resources/julia/lib/julia/libjulia-internal.1.7.dylib (unknown line)
jl_toplevel_eval_flex at /Applications/Julia-1.7.app/Contents/Resources/julia/lib/julia/libjulia-internal.1.7.dylib (unknown line)
jl_toplevel_eval_in at /Applications/Julia-1.7.app/Contents/Resources/julia/lib/julia/libjulia-internal.1.7.dylib (unknown line)
eval at ./boot.jl:373 [inlined]
include_string at ./loading.jl:1196
jl_apply_generic at /Applications/Julia-1.7.app/Contents/Resources/julia/lib/julia/libjulia-internal.1.7.dylib (unknown line)
_include at ./loading.jl:1253
include at ./client.jl:451
jl_apply_generic at /Applications/Julia-1.7.app/Contents/Resources/julia/lib/julia/libjulia-internal.1.7.dylib (unknown line)
do_call at /Applications/Julia-1.7.app/Contents/Resources/julia/lib/julia/libjulia-internal.1.7.dylib (unknown line)
eval_body at /Applications/Julia-1.7.app/Contents/Resources/julia/lib/julia/libjulia-internal.1.7.dylib (unknown line)
jl_interpret_toplevel_thunk at /Applications/Julia-1.7.app/Contents/Resources/julia/lib/julia/libjulia-internal.1.7.dylib (unknown line)
jl_toplevel_eval_flex at /Applications/Julia-1.7.app/Contents/Resources/julia/lib/julia/libjulia-internal.1.7.dylib (unknown line)
jl_toplevel_eval_flex at /Applications/Julia-1.7.app/Contents/Resources/julia/lib/julia/libjulia-internal.1.7.dylib (unknown line)
jl_toplevel_eval_in at /Applications/Julia-1.7.app/Contents/Resources/julia/lib/julia/libjulia-internal.1.7.dylib (unknown line)
eval at ./boot.jl:373 [inlined]
include_string at ./loading.jl:1196
jl_apply_generic at /Applications/Julia-1.7.app/Contents/Resources/julia/lib/julia/libjulia-internal.1.7.dylib (unknown line)
_include at ./loading.jl:1253
include at ./client.jl:451
jl_apply_generic at /Applications/Julia-1.7.app/Contents/Resources/julia/lib/julia/libjulia-internal.1.7.dylib (unknown line)
do_call at /Applications/Julia-1.7.app/Contents/Resources/julia/lib/julia/libjulia-internal.1.7.dylib (unknown line)
eval_body at /Applications/Julia-1.7.app/Contents/Resources/julia/lib/julia/libjulia-internal.1.7.dylib (unknown line)
jl_interpret_toplevel_thunk at /Applications/Julia-1.7.app/Contents/Resources/julia/lib/julia/libjulia-internal.1.7.dylib (unknown line)
jl_toplevel_eval_flex at /Applications/Julia-1.7.app/Contents/Resources/julia/lib/julia/libjulia-internal.1.7.dylib (unknown line)
jl_toplevel_eval_flex at /Applications/Julia-1.7.app/Contents/Resources/julia/lib/julia/libjulia-internal.1.7.dylib (unknown line)
jl_toplevel_eval_in at /Applications/Julia-1.7.app/Contents/Resources/julia/lib/julia/libjulia-internal.1.7.dylib (unknown line)
eval at ./boot.jl:373 [inlined]
exec_options at ./client.jl:268
_start at ./client.jl:495
jfptr__start_41470 at /Applications/Julia-1.7.app/Contents/Resources/julia/lib/julia/sys.dylib (unknown line)
jl_apply_generic at /Applications/Julia-1.7.app/Contents/Resources/julia/lib/julia/libjulia-internal.1.7.dylib (unknown line)
true_main at /Applications/Julia-1.7.app/Contents/Resources/julia/lib/julia/libjulia-internal.1.7.dylib (unknown line)
jl_repl_entrypoint at /Applications/Julia-1.7.app/Contents/Resources/julia/lib/julia/libjulia-internal.1.7.dylib (unknown line)
Allocations: 582754680 (Pool: 582550043; Big: 204637); GC: 499
ERROR: Package ADNLPModels errored during testing (exit code: 139)

"UndefVarError: `ZygoteAD` not defined" when trying to use Zygote as AD backend

I'm trying to switch the AD backend to Zygote.jl, but I keep getting "ZygoteAD not defined", even though I followed the ADNLPModels.jl docs.

The README says:

- `Zygote.jl`: you must load `Zygote.jl` separately and pass `ADNLPModels.ZygoteAD()` as the `adbackend` keyword argument to the `ADNLPModel` or `ADNLSModel` constructor.

The documentation provides this example:

using ADNLPModels
...
using Zygote
ADNLPModel(f, x0; backend = ADNLPModels.ZygoteAD)

So I write code as in the docs, but get the error:

julia> using Zygote # Load Zygote separately

julia> import ADNLPModels

julia> ADNLPModels.Zygote # ZygoteAD not listed
Zygote            ZygoteADGradient  ZygoteADHessian   ZygoteADJacobian  ZygoteADJprod
ZygoteADJtprod
julia> ADNLPModels.ZygoteAD
ERROR: UndefVarError: `ZygoteAD` not defined
Stacktrace:
 [1] getproperty(x::Module, f::Symbol)
   @ Base ./Base.jl:31
 [2] top-level scope
   @ REPL[3]:1

julia> 

Versions:

  • ADNLPModels v0.6.1
  • Zygote v0.6.60
  • Julia 1.9.0-rc2 (2023-04-01)

How to switch the AD backend to Zygote?

Improve docs

The documentation doesn't show how to change the AD backend.
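As a sketch of what such documentation could show (hedged: the `gradient_backend` and `jacobian_backend` keyword names are assumed from the backend kwargs used in the test suite, and `ZygoteADGradient`/`ZygoteADJacobian` are the types listed by tab completion in the issue above):

```julia
using ADNLPModels, Zygote  # Zygote must be loaded first

f(x) = sum(x .^ 2)
# Swap only selected operations to Zygote; the rest keep their defaults.
nlp = ADNLPModel(f, ones(3);
                 gradient_backend = ADNLPModels.ZygoteADGradient,
                 jacobian_backend = ADNLPModels.ZygoteADJacobian)
```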

Sparse Jacobian/Hessian not GPU-compatible

See the following tests:

  • multiple_precision_nls_array(T -> nls_from_T(T; jacobian_backend = ADNLPModels.ForwardDiffADJacobian, hessian_backend = ADNLPModels.ForwardDiffADHessian, jacobian_residual_backend = ADNLPModels.ForwardDiffADJacobian, hessian_residual_backend = ADNLPModels.ForwardDiffADHessian), CuArray, exclude = [jprod, jprod_residual, hprod_residual], linear_api = true)
  • multiple_precision_nlp_array(T -> nlp_from_T(T; jacobian_backend = ADNLPModels.ForwardDiffADJacobian, hessian_backend = ADNLPModels.ForwardDiffADHessian), CuArray, exclude = [jth_hprod, hprod, jprod], linear_api = true)

A MWE:
using CUDA, ADNLPModels, NLPModels, Symbolics

hs6_autodiff(::Type{T}; kwargs...) where {T <: Number} = hs6_autodiff(Vector{T}; kwargs...)
function hs6_autodiff(::Type{S} = Vector{Float64}; kwargs...) where {S}
  x0 = S([-12 // 10; 1])
  f(x) = (1 - x[1])^2
  c(x) = [10 * (x[2] - x[1]^2)]
  lcon = fill!(S(undef, 1), 0)
  ucon = fill!(S(undef, 1), 0)

  return ADNLPModel(f, x0, c, lcon, ucon, name = "hs6_autodiff"; kwargs...)
end
S = CuArray{Float64}
function c!(cx, x)
    cx .= [10 * (x[2] - x[1]^2)]
    return cx
end
x0 = S([-12 // 10; 1])
output = similar(x0, 1)
# nlp = hs6_autodiff(CuArray{Float64})
# ADNLPModels.SparseADJacobian(2, x -> (1 - x[1])^2, 1, c!, x0 = x0)
# J = ADNLPModels.compute_jacobian_sparsity(c!, output, x0)
J = Symbolics.jacobian_sparsity(c!, output, x0)

Add documentation on sparse jacobian

In particular, for the coloring, we also have other algorithms (see the unit tests):

  • GreedyD1Color;
  • BacktrackingColor;
  • ContractionColor;
  • GreedyStar1Color;
  • GreedyStar2Color;
  • AcyclicColoring.

I think we can easily use the coloring function that I interfaced last week in CUDA.jl.
I will open an issue in SparseDiffTools when it is merged.

Originally posted by @amontoison in #105 (comment)

TagBot trigger issue

This issue is used to trigger TagBot; feel free to unsubscribe.

If you haven't already, you should update your TagBot.yml to include issue comment triggers.
Please see this post on Discourse for instructions and more details.

If you'd like for me to do this for you, comment TagBot fix on this issue.
I'll open a PR within a few hours, please be patient!

hprod/jprod not GPU-compatible

See the following tests:

  • multiple_precision_nls_array(T -> nls_from_T(T; jacobian_backend = ADNLPModels.ForwardDiffADJacobian, hessian_backend = ADNLPModels.ForwardDiffADHessian, jacobian_residual_backend = ADNLPModels.ForwardDiffADJacobian, hessian_residual_backend = ADNLPModels.ForwardDiffADHessian), CuArray, exclude = [jprod, jprod_residual, hprod_residual], linear_api = true)
  • multiple_precision_nlp_array(T -> nlp_from_T(T; jacobian_backend = ADNLPModels.ForwardDiffADJacobian, hessian_backend = ADNLPModels.ForwardDiffADHessian), CuArray, exclude = [jth_hprod, hprod, jprod], linear_api = true)

A MWE:
using CUDA, ADNLPModels, NLPModels

hs6_autodiff(::Type{T}; kwargs...) where {T <: Number} = hs6_autodiff(Vector{T}; kwargs...)
function hs6_autodiff(::Type{S} = Vector{Float64}; kwargs...) where {S}
  x0 = S([-12 // 10; 1])
  f(x) = (1 - x[1])^2
  c(x) = [10 * (x[2] - x[1]^2)]
  lcon = fill!(S(undef, 1), 0)
  ucon = fill!(S(undef, 1), 0)

  return ADNLPModel(f, x0, c, lcon, ucon, name = "hs6_autodiff"; kwargs...)
end

nlp = hs6_autodiff(CuArray{Float64})
CUDA.allowscalar(true)
jth_hprod(nlp, nlp.meta.x0, nlp.meta.x0, 1) # same for hprod(nlp, nlp.meta.x0, nlp.meta.x0)
#=
ERROR: GPU compilation of MethodInstance for (::GPUArrays.var"#map_kernel#38"{…})(::CUDA.CuKernelContext, ::CuDeviceVector{…}, ::Base.Broadcast.Broadcasted{…}, ::Int64) failed
KernelError: passing and using non-bitstype argument

Argument 4 to your kernel function is of type Base.Broadcast.Broadcasted{Base.Broadcast.DefaultArrayStyle{1}, Tuple{Base.OneTo{Int64}}, ForwardDiff.var"#85#86"{ForwardDiff.Tag{ADNLPModels.var"#lag#141"{Int64, var"#f#6", Int64, ADNLPModels.var"#c!#319"{var"#c#7"}, ADNLPModels.var"#lag#134#142"}, Float64}}, Tuple{Base.Broadcast.Extruded{Vector{ForwardDiff.Dual{ForwardDiff.Tag{ADNLPModels.var"#lag#141"{Int64, var"#f#6", Int64, ADNLPModels.var"#c!#319"{var"#c#7"}, ADNLPModels.var"#lag#134#142"}, Float64}, Float64, 1}}, Tuple{Bool}, Tuple{Int64}}}}, which is not isbits:
  .args is of type Tuple{Base.Broadcast.Extruded{Vector{ForwardDiff.Dual{ForwardDiff.Tag{ADNLPModels.var"#lag#141"{Int64, var"#f#6", Int64, ADNLPModels.var"#c!#319"{var"#c#7"}, ADNLPModels.var"#lag#134#142"}, Float64}, Float64, 1}}, Tuple{Bool}, Tuple{Int64}}} which is not isbits.
    .1 is of type Base.Broadcast.Extruded{Vector{ForwardDiff.Dual{ForwardDiff.Tag{ADNLPModels.var"#lag#141"{Int64, var"#f#6", Int64, ADNLPModels.var"#c!#319"{var"#c#7"}, ADNLPModels.var"#lag#134#142"}, Float64}, Float64, 1}}, Tuple{Bool}, Tuple{Int64}} which is not isbits.
      .x is of type Vector{ForwardDiff.Dual{ForwardDiff.Tag{ADNLPModels.var"#lag#141"{Int64, var"#f#6", Int64, ADNLPModels.var"#c!#319"{var"#c#7"}, ADNLPModels.var"#lag#134#142"}, Float64}, Float64, 1}} which is not isbits.
=#

Bug sparse hessian with conditionals

There are at least two examples from OptimizationProblems.jl:

function AMPGO07(; n::Int = default_nvar, type::Val{T} = Val(Float64), kwargs...) where {T}
  function f(x)
    return x[1] <= 0 ? convert(eltype(x), Inf) :
           sin(x[1]) + sin(10 // 3 * x[1]) + log(abs(x[1])) - 84 // 100 * x[1] + 3
  end
  x0 = T(2.7) * ones(T, 1)
  return ADNLPModels.ADNLPModel(f, x0, name = "AMPGO07"; kwargs...)
end

and

function AMPGO13(; n::Int = default_nvar, type::Val{T} = Val(Float64), kwargs...) where {T}
  function f(x)
    n = length(x)
    return 0 < x[1] < 1 ? -(x[1]^(2 // 3) + (1 - x[1]^2)^(1 // 3)) : convert(eltype(x), Inf)
  end
  x0 = T[1 / 1000]
  return ADNLPModels.ADNLPModel(f, x0, name = "AMPGO13"; kwargs...)
end

Note, though, that these objective functions are not continuous.

Improve show

The show method could use some improvements. Here's some sample output:

julia> model = hovercraft1d()
ADNLPModel - Model with automatic differentiation backend ADNLPModels.ADModelBackend{ADNLPModels.ForwardDiffADGradient, ADNLPModels.ForwardDiffADHvprod, ADNLPModels.ForwardDiffADJprod, ADNLPModels.ForwardDiffADJtprod, ADNLPModels.ForwardDiffADJacobian, ADNLPModels.ForwardDiffADHessian, ADNLPModels.ForwardDiffADGHjvprod}(ADNLPModels.ForwardDiffADGradient(ForwardDiff.GradientConfig{...}(...)), ...)
[hundreds more lines of ForwardDiff.Dual internals omitted]
Float64}}(2.12199579097e-313,4.0e-323,3.38202793e-314,5.215019589764e-310,0.0,0.0,0.0,0.0,5.21501842472993e-310,1.03e-321,2.12199579097e-313,4.0e-323), Dual{ForwardDiff.Tag{OptimizationProblems.ADNLPProblems.var"#f#382"{Int64}, Float64}}(3.38202793e-314,5.215019589764e-310,0.0,0.0,2.1219957905e-314,0.0,5.21501842472993e-310,1.067e-321,2.12199579097e-313,4.0e-323,3.38202793e-314,5.215019589764e-310), Dual{ForwardDiff.Tag{OptimizationProblems.ADNLPProblems.var"#f#382"{Int64}, Float64}}(0.0,0.0,0.0,0.0,5.21501842472993e-310,1.107e-321,2.12199579097e-313,4.0e-323,NaN,5.215019589764e-310,0.0,0.0), Dual{ForwardDiff.Tag{OptimizationProblems.ADNLPProblems.var"#f#382"{Int64}, Float64}}(2.1219957905e-314,0.0,5.21501842472993e-310,1.146e-321,2.12199579097e-313,4.0e-323,2.1469600674e-314,5.215019589764e-310,0.0,0.0,0.0,0.0), Dual{ForwardDiff.Tag{OptimizationProblems.ADNLPProblems.var"#f#382"{Int64}, Float64}}(5.21501842472993e-310,1.186e-321,2.12199579097e-313,4.0e-323,3.38202793e-314,5.215019589764e-310,0.0,0.0,2.1219957905e-314,0.0,5.21501842472993e-310,1.225e-321), Dual{ForwardDiff.Tag{OptimizationProblems.ADNLPProblems.var"#f#382"{Int64}, Float64}}(2.12199579097e-313,4.0e-323,3.38202793e-314,5.215019589764e-310,0.0,0.0,0.0,0.0,5.21501842472993e-310,1.265e-321,2.12199579097e-313,5.0e-324), Dual{ForwardDiff.Tag{OptimizationProblems.ADNLPProblems.var"#f#382"{Int64}, Float64}}(2.1469600674e-314,5.215019589764e-310,0.0,0.0,2.1219957905e-314,0.0,5.21501842472993e-310,1.27e-321,2.12199579097e-313,5.0e-324,NaN,5.215019589764e-310), Dual{ForwardDiff.Tag{OptimizationProblems.ADNLPProblems.var"#f#382"{Int64}, Float64}}(0.0,0.0,0.0,0.0,5.21501842472993e-310,1.304e-321,2.12199579097e-313,4.0e-323,3.38202793e-314,5.215019589764e-310,0.0,0.0), Dual{ForwardDiff.Tag{OptimizationProblems.ADNLPProblems.var"#f#382"{Int64}, Float64}}(2.1219957905e-314,0.0,5.21501842506195e-310,0.0,2.33419537006e-313,4.0e-323,3.38202793e-314,5.215019589764e-310,0.0,0.0,0.0,0.0), 
Dual{ForwardDiff.Tag{OptimizationProblems.ADNLPProblems.var"#f#382"{Int64}, Float64}}(5.21501842725323e-310,0.0,0.0,4.0e-323,2.1220929287e-314,5.21501958973554e-310,0.0,0.0,2.1219957905e-314,0.0,5.2150175747054e-310,0.0), Dual{ForwardDiff.Tag{OptimizationProblems.ADNLPProblems.var"#f#382"{Int64}, Float64}}(0.0,4.0e-323,NaN,0.0,0.0,0.0,0.0,0.0,5.2150172947441e-310,0.0,2.12199579097e-313,4.0e-323), Dual{ForwardDiff.Tag{OptimizationProblems.ADNLPProblems.var"#f#382"{Int64}, Float64}}(5.453530154163e-312,0.0,0.0,0.0,2.1219957905e-314,0.0,5.21501842316315e-310,0.0,3.062247015e-315,8.0e-323,2.1220929366e-314,0.0), Dual{ForwardDiff.Tag{OptimizationProblems.ADNLPProblems.var"#f#382"{Int64}, Float64}}(0.0,0.0,0.0,0.0,5.2150184231157e-310,0.0,2.2142524967e-313,8.0e-323,3.38202793e-314,0.0,0.0,0.0), Dual{ForwardDiff.Tag{OptimizationProblems.ADNLPProblems.var"#f#382"{Int64}, Float64}}(2.1219957905e-314,0.0,5.21501842316315e-310,8.0e-323,3.062247015e-315,4.0e-323,3.3820279375e-314,0.0,0.0,0.0,0.0,0.0), Dual{ForwardDiff.Tag{OptimizationProblems.ADNLPProblems.var"#f#382"{Int64}, Float64}}(5.2150184231157e-310,8.0e-323,2.2142524967e-313,4.0e-323,2.122092929e-314,0.0,0.0,0.0,2.1219957905e-314,0.0,5.2150184237323e-310,0.0), Dual{ForwardDiff.Tag{OptimizationProblems.ADNLPProblems.var"#f#382"{Int64}, Float64}}(2.46328909176e-313,4.0e-323,3.38202793e-314,5.215019589764e-310,0.0,0.0,0.0,0.0,5.2150172947014e-310,0.0,2.2510547178e-313,4.0e-323), Dual{ForwardDiff.Tag{OptimizationProblems.ADNLPProblems.var"#f#382"{Int64}, Float64}}(5.45377882554e-312,0.0,0.0,0.0,2.1219957905e-314,0.0,5.2150175751054e-310,0.0,0.0,1.0e-323,3.3820279375e-314,0.0), Dual{ForwardDiff.Tag{OptimizationProblems.ADNLPProblems.var"#f#382"{Int64}, Float64}}(0.0,0.0,0.0,0.0,5.21501842472993e-310,1.265e-321,2.12199579097e-313,1.0e-323,3.38202793e-314,0.0,0.0,0.0), Dual{ForwardDiff.Tag{OptimizationProblems.ADNLPProblems.var"#f#382"{Int64}, 
Float64}}(2.1219957905e-314,0.0,5.2150175750793e-310,0.0,0.0,8.0e-323,2.146960075e-314,0.0,0.0,0.0,0.0,0.0), Dual{ForwardDiff.Tag{OptimizationProblems.ADNLPProblems.var"#f#382"{Int64}, Float64}}(5.21501842472993e-310,1.146e-321,2.12199579097e-313,8.0e-323,2.122092929e-314,0.0,0.0,0.0,2.1219957905e-314,0.0,5.21501757505323e-310,0.0), Dual{ForwardDiff.Tag{OptimizationProblems.ADNLPProblems.var"#f#382"{Int64}, Float64}}(0.0,8.0e-323,2.1220929366e-314,0.0,0.0,0.0,0.0,0.0,5.21501842472993e-310,1.03e-321,2.12199579097e-313,8.0e-323), Dual{ForwardDiff.Tag{OptimizationProblems.ADNLPProblems.var"#f#382"{Int64}, Float64}}(2.1469600674e-314,0.0,0.0,0.0,2.1219957905e-314,0.0,5.21501757502714e-310,0.0,0.0,8.0e-323,3.3820279375e-314,0.0), Dual{ForwardDiff.Tag{OptimizationProblems.ADNLPProblems.var"#f#382"{Int64}, Float64}}(0.0,0.0,0.0,0.0,5.21501842472993e-310,9.1e-322,2.12199579097e-313,8.0e-323,3.38202793e-314,0.0,0.0,0.0), Dual{ForwardDiff.Tag{OptimizationProblems.ADNLPProblems.var"#f#382"{Int64}, Float64}}(2.1219957905e-314,0.0,5.21501757500105e-310,0.0,0.0,8.0e-323,3.3820279375e-314,0.0,0.0,0.0,0.0,0.0), Dual{ForwardDiff.Tag{OptimizationProblems.ADNLPProblems.var"#f#382"{Int64}, Float64}}(5.21501842472993e-310,7.9e-322,2.12199579097e-313,8.0e-323,2.122092929e-314,0.0,0.0,0.0,2.1219957905e-314,0.0,2.1219958226e-314,NaN), Dual{ForwardDiff.Tag{OptimizationProblems.ADNLPProblems.var"#f#382"{Int64}, Float64}}(2.630777771e-314,NaN,5.508170338e-314,4.2439915854e-314,2.6308362763e-314,NaN,2.630838363e-314,4.243991613e-314,NaN,NaN,2.630835126e-314,4.2439916047e-314), Dual{ForwardDiff.Tag{OptimizationProblems.ADNLPProblems.var"#f#382"{Int64}, Float64}}(2.6308352605e-314,4.2439916056e-314,NaN,4.243991597e-314,NaN,NaN,5.5101967027e-314,2.121995802e-314,2.630836043e-314,NaN,NaN,NaN), Dual{ForwardDiff.Tag{OptimizationProblems.ADNLPProblems.var"#f#382"{Int64}, 
Float64}}(2.630776783e-314,NaN,2.6308351933e-314,4.243991605e-314,2.6308361775e-314,NaN,2.630839383e-314,4.243991618e-314,5.510196505e-314,4.243991592e-314,NaN,2.6307461903e-314), Dual{ForwardDiff.Tag{OptimizationProblems.ADNLPProblems.var"#f#382"{Int64}, Float64}}(4.2439915814e-314,4.2439916175e-314,NaN,4.2439915913e-314,2.63077787e-314,2.121995836e-314,2.630838197e-314,NaN,2.630835395e-314,NaN,2.630836375e-314,4.2439916106e-314), Dual{ForwardDiff.Tag{OptimizationProblems.ADNLPProblems.var"#f#382"{Int64}, Float64}}(2.6307768817e-314,2.121995831e-314,2.630838296e-314,2.1219958216e-314,NaN,4.243991607e-314,NaN,NaN,6.3659873724e-314,4.2439915923e-314,5.508170437e-314,4.243991586e-314), Dual{ForwardDiff.Tag{OptimizationProblems.ADNLPProblems.var"#f#382"{Int64}, Float64}}(2.6308353277e-314,4.243991606e-314,2.6308379996e-314,NaN,2.630777937e-314,4.2439916274e-314,2.630839648e-314,2.1219958285e-314,NaN,NaN,5.510197916e-314,4.243991598e-314), Dual{ForwardDiff.Tag{OptimizationProblems.ADNLPProblems.var"#f#382"{Int64}, Float64}}(NaN,5.5112593233e-314,2.630834395e-314,NaN,4.2439915814e-314,2.1219958167e-314,5.5111740397e-314,4.243991583e-314,5.5081706344e-314,NaN,2.6308385964e-314,4.2439916145e-314), Dual{ForwardDiff.Tag{OptimizationProblems.ADNLPProblems.var"#f#382"{Int64}, Float64}}(5.5101969004e-314,2.121995803e-314,5.5081705356e-314,NaN,2.6308386636e-314,2.121995824e-314,5.5101978806e-314,4.2439915977e-314,2.630834462e-314,NaN,2.630835462e-314,NaN), Dual{ForwardDiff.Tag{OptimizationProblems.ADNLPProblems.var"#f#382"{Int64}, Float64}}(6.3659873724e-314,NaN,NaN,NaN,5.508170733e-314,NaN,NaN,2.1219958053e-314,5.511174174e-314,4.243991582e-314,5.5101968015e-314,NaN), Dual{ForwardDiff.Tag{OptimizationProblems.ADNLPProblems.var"#f#382"{Int64}, Float64}}(2.630839549e-314,2.121995828e-314,2.6308345293e-314,4.243991601e-314,2.630834628e-314,4.2439916017e-314,2.630777704e-314,NaN,4.2439915814e-314,4.2439916136e-314,2.630839482e-314,2.1219958275e-314), 
Dual{ForwardDiff.Tag{OptimizationProblems.ADNLPProblems.var"#f#382"{Int64}, Float64}}(5.511174107e-314,2.1219957915e-314,2.630777668e-314,4.2439916254e-314,2.630838529e-314,NaN,NaN,5.511105914e-314,2.6307772493e-314,4.2439916244e-314,NaN,NaN), Dual{ForwardDiff.Tag{OptimizationProblems.ADNLPProblems.var"#f#382"{Int64}, Float64}}(NaN,2.1219958043e-314,NaN,4.2439915987e-314,4.2439915814e-314,2.121995811e-314,5.5101971336e-314,2.1219958043e-314,NaN,5.5110908153e-314,NaN,4.243991584e-314), Dual{ForwardDiff.Tag{OptimizationProblems.ADNLPProblems.var"#f#382"{Int64}, Float64}}(2.6307773165e-314,2.121995834e-314,5.5081708004e-314,4.243991588e-314,NaN,4.243991592e-314,NaN,2.1219958004e-314,2.630834731e-314,4.2439916027e-314,2.630839881e-314,4.243991621e-314), Dual{ForwardDiff.Tag{OptimizationProblems.ADNLPProblems.var"#f#382"{Int64}, Float64}}(6.3659873724e-314,4.243991596e-314,5.510198181e-314,NaN,NaN,4.2439915923e-314,5.510198082e-314,2.1219958083e-314,5.5081709664e-314,4.243991589e-314,2.630838731e-314,2.1219958246e-314)])), ADNLPModels.ForwardDiffADHvprod(), ADNLPModels.ForwardDiffADJprod(), ADNLPModels.ForwardDiffADJtprod(), ADNLPModels.ForwardDiffADJacobian(6664), ADNLPModels.ForwardDiffADHessian(4851), ADNLPModels.ForwardDiffADGHjvprod())
  Problem name: hovercraft1d
   All variables: ████████████████████ 98     All constraints: ████████████████████ 68
            free: ████████████████████ 98                free: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0
           lower: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0                lower: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0
           upper: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0                upper: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0
         low/upp: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0              low/upp: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0
           fixed: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0                fixed: ████████████████████ 68
          infeas: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0               infeas: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0
            nnzh: (  0.00% sparsity)   4851            linear: ████████████████████ 68
                                                    nonlinear: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0
                                                         nnzj: (  0.00% sparsity)   6664

  Counters:
             obj: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0                 grad: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0                 cons: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0
        cons_lin: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0             cons_nln: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0                 jcon: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0
           jgrad: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0                  jac: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0              jac_lin: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0
         jac_nln: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0                jprod: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0            jprod_lin: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0
       jprod_nln: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0               jtprod: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0           jtprod_lin: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0
      jtprod_nln: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0                 hess: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0                hprod: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0
           jhess: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0               jhprod: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0

In a terminal window, the very long line results in many lines of gibberish.

Use SparseDiffTools with ADNLPModel?

Hi,
This looks like a nice and useful package :) Thanks for the work. I was wondering whether it would be easy to use SparseDiffTools.jl in this package, e.g., for Hessian calculations?
If so, could you point me in the right direction?
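For context, the package already dispatches each AD operation to a backend object that can be swapped at construction time via keyword arguments (as the `jtprod_backend` example elsewhere in this page shows). A minimal sketch of that pattern, assuming the `hessian_backend` keyword and `ForwardDiffADHessian` backend available in recent versions; a SparseDiffTools-backed Hessian would presumably be a new `ADNLPModels.ADBackend` subtype passed the same way (the exact extension methods to overload are an assumption, not documented API):

```julia
using ADNLPModels

f(x) = 100 * (x[2] - x[1]^2)^2 + (x[1] - 1)^2
x0 = [-1.2; 1.0]

# Select the Hessian backend at construction time; other operations
# (gradient, Jacobian, products, ...) have analogous keywords.
nlp = ADNLPModel(f, x0, hessian_backend = ADNLPModels.ForwardDiffADHessian)

# A SparseDiffTools-based backend would follow the same pattern:
# define `struct SparseDiffToolsADHessian <: ADNLPModels.ADBackend ... end`
# (hypothetical name), implement the Hessian entry points for it, and pass
# it as `hessian_backend = SparseDiffToolsADHessian`.
```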

WARNING: method definition ... declares type variable AD but does not use it

With Julia 1.9 and ADNLPModels 0.6.2, I'm getting

┌ ADNLPModels [54578032-b7ea-4c30-94aa-7cbd1cce6c9a]
│  WARNING: method definition for #ADNLSModel!#251 at /home/runner/.julia/packages/ADNLPModels/gdqYX/src/nls.jl:149 declares type variable AD but does not use it.
│  WARNING: method definition for #ADNLSModel#287 at /home/runner/.julia/packages/ADNLPModels/gdqYX/src/nls.jl:570 declares type variable AD but does not use it.
│  WARNING: method definition for #ADNLSModel!#290 at /home/runner/.julia/packages/ADNLPModels/gdqYX/src/nls.jl:600 declares type variable AD but does not use it.

https://github.com/JuliaSmoothOptimizers/RegularizedProblems.jl/actions/runs/5180679050/jobs/9335162380?pr=41#step:6:680

Inaccurate display

julia> nls_opt = ADNLSModel(resid, ones(5), 202, jtprod_backend = ADNLPModels.ZygoteADJtprod)
ADNLSModel - Nonlinear least-squares model with automatic differentiation backend ADModelBackend{
  ForwardDiffADGradient,
  ForwardDiffADHvprod,
  EmptyADbackend,
  EmptyADbackend,
  EmptyADbackend,
  ForwardDiffADHessian,
  EmptyADbackend,
}

The non-default AD backend for jtprod is not reflected in the output.
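Until the printing reflects every backend, the configured backends can be inspected directly on the model. This assumes the backend struct is stored in an `adbackend` field with per-operation fields such as `jtprod_backend`, as in recent versions of the package:

```julia
# Inspect the full backend struct rather than relying on `show`:
nls_opt.adbackend

# Or a specific backend, e.g. the one passed above
# (field names here are assumed from the package source):
nls_opt.adbackend.jtprod_backend
```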
