# Ipopt.jl

A Julia interface to the Ipopt nonlinear solver.
Ipopt.jl is a wrapper for the Ipopt solver.
## Affiliation
This wrapper is maintained by the JuMP community and is not a COIN-OR project.
## Getting help
If you need help, please ask a question on the JuMP community forum.
If you have a reproducible example of a bug, please open a GitHub issue.
## License
Ipopt.jl is licensed under the MIT License.
The underlying solver, coin-or/Ipopt, is licensed under the Eclipse Public License.
## Installation
Install Ipopt.jl using the Julia package manager:

```julia
import Pkg
Pkg.add("Ipopt")
```
In addition to installing the Ipopt.jl package, this will also download and
install the Ipopt binaries. You do not need to install Ipopt separately.
To use a custom binary, read the Custom solver binaries section of the JuMP documentation.
For details on using a different linear solver, see the Linear Solvers section
below. You do not need a custom binary to change the linear solver.
## Use with JuMP
You can use Ipopt with JuMP as follows:

```julia
using JuMP, Ipopt

model = Model(Ipopt.Optimizer)
set_attribute(model, "max_cpu_time", 60.0)
set_attribute(model, "print_level", 0)
```
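Building on the attributes shown above, a complete solve might look like the following sketch. The model itself is illustrative and not from the source; it is a small convex problem whose optimum is `x = 2`:

```julia
using JuMP, Ipopt

model = Model(Ipopt.Optimizer)
set_silent(model)
@variable(model, x >= 0)
# A small convex problem with known optimum x = 2.
@objective(model, Min, (x - 2)^2)
optimize!(model)
@assert termination_status(model) == LOCALLY_SOLVED
@assert isapprox(value(x), 2.0; atol = 1e-6)
```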
### Type stability

Ipopt.jl v1.10.0 moved the `Ipopt.Optimizer` object to a package extension. As a
consequence, `Ipopt.Optimizer` is now type unstable, and it will be inferred as
`Ipopt.Optimizer()::Any`.

In most cases, this should not impact performance. If it does, there are two work-arounds.
First, you can use a function barrier:

```julia
using JuMP, Ipopt

function main(optimizer::T) where {T}
    model = Model(optimizer)
    return
end

main(Ipopt.Optimizer)
```
Although the outer `Ipopt.Optimizer` is type unstable, the optimizer inside
`main` will be properly inferred.
Second, you may explicitly get and use the extension module:

```julia
using JuMP, Ipopt

const IpoptMathOptInterfaceExt =
    Base.get_extension(Ipopt, :IpoptMathOptInterfaceExt)

model = Model(IpoptMathOptInterfaceExt.Optimizer)
```
## MathOptInterface API
The Ipopt optimizer supports the following constraints and attributes.
List of supported objective functions:

* `MOI.ObjectiveFunction{MOI.ScalarAffineFunction{Float64}}`
* `MOI.ObjectiveFunction{MOI.ScalarNonlinearFunction}`
* `MOI.ObjectiveFunction{MOI.ScalarQuadraticFunction{Float64}}`
* `MOI.ObjectiveFunction{MOI.VariableIndex}`
List of supported variable types:

* `MOI.Reals`
List of supported constraint types:

* `MOI.ScalarAffineFunction{Float64}` in `MOI.EqualTo{Float64}`
* `MOI.ScalarAffineFunction{Float64}` in `MOI.GreaterThan{Float64}`
* `MOI.ScalarAffineFunction{Float64}` in `MOI.Interval{Float64}`
* `MOI.ScalarAffineFunction{Float64}` in `MOI.LessThan{Float64}`
* `MOI.ScalarNonlinearFunction` in `MOI.EqualTo{Float64}`
* `MOI.ScalarNonlinearFunction` in `MOI.GreaterThan{Float64}`
* `MOI.ScalarNonlinearFunction` in `MOI.Interval{Float64}`
* `MOI.ScalarNonlinearFunction` in `MOI.LessThan{Float64}`
* `MOI.ScalarQuadraticFunction{Float64}` in `MOI.EqualTo{Float64}`
* `MOI.ScalarQuadraticFunction{Float64}` in `MOI.GreaterThan{Float64}`
* `MOI.ScalarQuadraticFunction{Float64}` in `MOI.Interval{Float64}`
* `MOI.ScalarQuadraticFunction{Float64}` in `MOI.LessThan{Float64}`
* `MOI.VariableIndex` in `MOI.EqualTo{Float64}`
* `MOI.VariableIndex` in `MOI.GreaterThan{Float64}`
* `MOI.VariableIndex` in `MOI.Interval{Float64}`
* `MOI.VariableIndex` in `MOI.LessThan{Float64}`
* `MOI.VectorOfVariables` in `MOI.VectorNonlinearOracle{Float64}`
List of supported model attributes:

* `MOI.BarrierIterations`
* `MOI.NLPBlock`
* `MOI.NLPBlockDualStart`
* `MOI.Name`
* `MOI.ObjectiveSense`
* `MOI.SolveTimeSec`
List of supported optimizer attributes:
## Options
Supported options are listed in the Ipopt documentation.
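For example, the following sketch sets two commonly used options. The names `tol` and `max_iter` are standard Ipopt option names; check the Ipopt documentation for the full list and default values before relying on them:

```julia
using JuMP, Ipopt

model = Model(Ipopt.Optimizer)
# Tighten the convergence tolerance and cap the iteration count.
# `tol` and `max_iter` are documented Ipopt options.
set_attribute(model, "tol", 1e-10)
set_attribute(model, "max_iter", 500)
```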
## Solver-specific callbacks
Ipopt provides a callback that can be used to log the status of the optimization
during a solve. It can also be used to terminate the optimization by returning
false. Here is an example:
```julia
using JuMP, Ipopt, Test

model = Model(Ipopt.Optimizer)
set_silent(model)
@variable(model, x >= 1)
@objective(model, Min, x + 0.5)
x_vals = Float64[]
function my_callback(
    alg_mod::Cint,
    iter_count::Cint,
    obj_value::Float64,
    inf_pr::Float64,
    inf_du::Float64,
    mu::Float64,
    d_norm::Float64,
    regularization_size::Float64,
    alpha_du::Float64,
    alpha_pr::Float64,
    ls_trials::Cint,
)
    push!(x_vals, callback_value(model, x))
    @test isapprox(obj_value, 1.0 * x_vals[end] + 0.5, atol = 1e-1)
    # return `true` to keep going, or `false` to terminate the optimization.
    return iter_count < 1
end
MOI.set(model, Ipopt.CallbackFunction(), my_callback)
optimize!(model)
@test MOI.get(model, MOI.TerminationStatus()) == MOI.INTERRUPTED
@test length(x_vals) == 2
```
See the Ipopt documentation for an explanation of the arguments to the callback. They are identical to the output contained in the logging table printed to the screen.
To access the current solution and primal, dual, and complementarity violations
of each iteration, use `Ipopt.GetIpoptCurrentViolations` and
`Ipopt.GetIpoptCurrentIterate`. The two functions are identical to the ones in
the Ipopt C interface.
## C API

Ipopt.jl wraps the Ipopt C interface with minimal modifications.

A complete example is available in the `test/C_wrapper.jl` file.
For simplicity, the five callbacks required by Ipopt are slightly different from the C interface. They are as follows:
"""
eval_f(x::Vector{Float64})::Float64
Returns the objective value `f(x)`.
"""
function eval_f end
"""
eval_grad_f(x::Vector{Float64}, grad_f::Vector{Float64})::Nothing
Fills `grad_f` in-place with the gradient of the objective function evaluated at
`x`.
"""
function eval_grad_f end
"""
eval_g(x::Vector{Float64}, g::Vector{Float64})::Nothing
Fills `g` in-place with the value of the constraints evaluated at `x`.
"""
function eval_g end
"""
eval_jac_g(
x::Vector{Float64},
rows::Vector{Cint},
cols::Vector{Cint},
values::Union{Nothing,Vector{Float64}},
)::Nothing
Compute the Jacobian matrix.
* If `values === nothing`
- Fill `rows` and `cols` with the 1-indexed sparsity structure
* Otherwise:
- Fill `values` with the elements of the Jacobian matrix according to the
sparsity structure.
!!! warning
If `values === nothing`, `x` is an undefined object. Accessing any elements
in it will cause Julia to segfault.
"""
function eval_jac_g end
"""
eval_h(
x::Vector{Float64},
rows::Vector{Cint},
cols::Vector{Cint},
obj_factor::Float64,
lambda::Float64,
values::Union{Nothing,Vector{Float64}},
)::Nothing
Compute the Hessian-of-the-Lagrangian matrix.
* If `values === nothing`
- Fill `rows` and `cols` with the 1-indexed sparsity structure
* Otherwise:
- Fill `values` with the Hessian matrix according to the sparsity structure.
!!! warning
If `values === nothing`, `x` is an undefined object. Accessing any elements
in it will cause Julia to segfault.
"""
function eval_h end
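As an illustration, here is a hypothetical set of these callbacks for the toy problem of minimizing `(x[1] - 2)^2 + (x[2] - 1)^2` with a single linear constraint `g(x) = x[1] + x[2]`. The problem is invented for this sketch; only the signatures follow the documentation above:

```julia
# Objective: f(x) = (x[1] - 2)^2 + (x[2] - 1)^2
eval_f(x::Vector{Float64}) = (x[1] - 2.0)^2 + (x[2] - 1.0)^2

function eval_grad_f(x::Vector{Float64}, grad_f::Vector{Float64})
    grad_f[1] = 2.0 * (x[1] - 2.0)
    grad_f[2] = 2.0 * (x[2] - 1.0)
    return
end

# One constraint: g(x) = x[1] + x[2]
function eval_g(x::Vector{Float64}, g::Vector{Float64})
    g[1] = x[1] + x[2]
    return
end

function eval_jac_g(
    x::Vector{Float64},
    rows::Vector{Cint},
    cols::Vector{Cint},
    values::Union{Nothing,Vector{Float64}},
)
    if values === nothing
        # Structure call: report the 1-indexed sparsity pattern. Do not read `x`.
        rows[1], cols[1] = Cint(1), Cint(1)
        rows[2], cols[2] = Cint(1), Cint(2)
    else
        # Value call: fill the nonzeros in the same order as the structure.
        values[1] = 1.0
        values[2] = 1.0
    end
    return
end

function eval_h(
    x::Vector{Float64},
    rows::Vector{Cint},
    cols::Vector{Cint},
    obj_factor::Float64,
    lambda::Vector{Float64},
    values::Union{Nothing,Vector{Float64}},
)
    if values === nothing
        # Lower triangle of the Hessian: two diagonal entries.
        rows[1], cols[1] = Cint(1), Cint(1)
        rows[2], cols[2] = Cint(2), Cint(2)
    else
        # The constraint is linear, so `lambda` does not contribute;
        # only the objective Hessian (2I) scaled by `obj_factor` remains.
        values[1] = obj_factor * 2.0
        values[2] = obj_factor * 2.0
    end
    return
end
```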
## INVALID_MODEL error

If you get the termination status `MOI.INVALID_MODEL`, it is probably because you
have some undefined value in your model, for example, a division by zero. Fix
this by removing the division, or by imposing variable bounds so that you cut
off the undefined region.
Instead of

```julia
model = Model(Ipopt.Optimizer)
@variable(model, x)
@NLobjective(model, Min, 1 / x)
```

do

```julia
model = Model(Ipopt.Optimizer)
@variable(model, x >= 0.0001)
@NLobjective(model, Min, 1 / x)
```
## Linear Solvers
To improve performance, Ipopt supports a number of linear solvers.
### HSL
Obtain a license and download HSL_jll.jl from
https://licences.stfc.ac.uk/products/Software/HSL/LibHSL.

Install this download into your current environment using:

```julia
import Pkg
Pkg.develop(path = "/full/path/to/HSL_jll.jl")
```
Then, use a linear solver in HSL by setting the `hsllib` and `linear_solver`
attributes:

```julia
using JuMP, Ipopt
import HSL_jll

model = Model(Ipopt.Optimizer)
set_attribute(model, "hsllib", HSL_jll.libhsl_path)
set_attribute(model, "linear_solver", "ma86")
```
The available HSL solvers are "ma27", "ma57", "ma77", "ma86", and "ma97".
We recommend using either sequential BLAS and LAPACK backends, or a
multithreaded version limited to one thread, when employing the linear solvers
"ma86" or "ma97". These solvers already leverage parallelism via OpenMP, and
enabling multiple threads in BLAS and LAPACK as well can oversubscribe the
available cores and degrade performance.
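The BLAS thread limit can be set from Julia with the standard `LinearAlgebra` interface:

```julia
using LinearAlgebra

# Limit BLAS/LAPACK to one thread so they do not compete with the
# OpenMP parallelism used internally by solvers such as "ma86" and "ma97".
BLAS.set_num_threads(1)
```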