# ADNLPModelProblems.jl
A list of optimization problems in ADNLPModel format
| Documentation | CI | Coverage | Release | DOI |
|:-----------------:|:------:|:------------:|:-----------:|:-------:|
| | | | | [![doi][doi-img]][doi-url] |
A list of optimization problems in the `ADNLPModel` format; see ADNLPModels.jl. Most of the problems given here are also available in JuMP format, either in OptimizationProblems.jl or within this package.
The list of available problems is given as `String`s in

```julia
ADNLPModelProblems.problems
```
This list includes unconstrained and constrained problems, several of which are scalable. The default value of the scaling parameter is:

```julia
ADNLPModelProblems.default_nvar
```
The list of problems for which there is no JuMP model is obtained by:

```julia
ADNLPModelProblems.problems_no_jump
```
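For instance, the two lists above can be inspected directly; this is a minimal sketch, assuming `problems` and `problems_no_jump` are plain collections of problem names:

```julia
using ADNLPModelProblems

# Total number of available problems.
println(length(ADNLPModelProblems.problems))

# Names of the problems that have no JuMP counterpart.
for name in ADNLPModelProblems.problems_no_jump
  println(name)
end
```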
There are several ways to access the problems, for instance "power":

```julia
ADNLPModelProblems.power_forward() # ForwardDiff backend
ADNLPModelProblems.power_reverse() # ReverseDiff backend
ADNLPModelProblems.power_zygote()  # Zygote backend
ADNLPModelProblems.power_jump()    # NLPModelsJuMP model
```

and using ADNLPModels.jl default backend

```julia
ADNLPModelProblems.power_autodiff()
```
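The returned models can then be evaluated as usual; this is a sketch assuming the constructors above return models that follow the standard NLPModels.jl API (`obj`, `grad`, and the `meta.x0` starting point):

```julia
using ADNLPModelProblems, NLPModels

nlp = ADNLPModelProblems.power_autodiff()  # "power" with the default AD backend
x0 = nlp.meta.x0                           # standard starting point of the problem
fx = obj(nlp, x0)                          # objective value at x0
gx = grad(nlp, x0)                         # gradient at x0
```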
Finally, the properties of each problem can be accessed via:
- `$(name_of_the_problem)_meta`: a `Dict` that contains the main information. This information is summed up for the whole test set in the variable `ADNLPModelProblems.meta`.
- `get_$(name_of_the_problem)_meta(n)`: returns the number of variables and constraints of the problem parameterized by `n`. If the problem is scalable, this varies from `$(name_of_the_problem)_meta[:nvar]` and `$(name_of_the_problem)_meta[:ncon]`, which were generated with `n = ADNLPModelProblems.default_nvar`.
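As an illustration for "power", a sketch assuming the naming scheme above (the value `100` is arbitrary, and we assume `get_power_meta` returns the pair of variable and constraint counts):

```julia
using ADNLPModelProblems

meta = ADNLPModelProblems.power_meta   # Dict with the main information
nvar = meta[:nvar]                     # size generated with n = default_nvar
ncon = meta[:ncon]

# Sizes of the problem for another value of the parameter n.
nvar100, ncon100 = ADNLPModelProblems.get_power_meta(100)
```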
## How to contribute?
Contributions with new problems are very welcome. Some advice for a successful contribution:
- Check that the new problem is not already on the list.
- Provide as much detail as possible regarding the origin of the problem, e.g. citation, link, application, ...
- The problem should be type-stable, i.e. the argument `type::Val{T} = Val(Float64)` should induce the type returned by the `ADNLPModel`.
- Fill in the `meta` as precisely as possible. The function `generate_meta` helps with this step.
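The type-stability requirement above can be sketched as follows; the constructor name `myproblem_autodiff`, the objective, and the keyword defaults are all hypothetical and only follow the conventions described above:

```julia
using ADNLPModels

# Hypothetical type-stable problem constructor; not part of the package.
function myproblem_autodiff(; n::Int = 100, type::Val{T} = Val(Float64)) where {T}
  x0 = fill(T(1) / 2, n)  # the starting point carries the requested type T
  f(x) = sum(T(100) * (x[i + 1] - x[i]^2)^2 + (x[i] - T(1))^2 for i = 1:(n - 1))
  return ADNLPModel(f, x0, name = "myproblem")  # the model inherits eltype T from x0
end
```

Passing `type = Val(Float32)` then yields a model whose starting point, objective value, and gradient are all `Float32`, without any `Float64` intermediates.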
