Reference
ADNLPModels.ADModelBackend
ADNLPModels.ADNLPModel
ADNLPModels.ADNLSModel
ADNLPModels.compute_hessian_sparsity
ADNLPModels.compute_jacobian_sparsity
ADNLPModels.get_F
ADNLPModels.get_adbackend
ADNLPModels.get_c
ADNLPModels.get_default_backend
ADNLPModels.get_lag
ADNLPModels.get_nln_nnzh
ADNLPModels.get_nln_nnzj
ADNLPModels.get_residual_nnzh
ADNLPModels.get_residual_nnzj
ADNLPModels.set_adbackend!
ADNLPModels.ADModelBackend
— Type
ADModelBackend(gradient_backend, hprod_backend, jprod_backend, jtprod_backend, jacobian_backend, hessian_backend, ghjvprod_backend, hprod_residual_backend, jprod_residual_backend, jtprod_residual_backend, jacobian_residual_backend, hessian_residual_backend)
Structure that defines the different backends used to compute the automatic differentiation of an ADNLPModel/ADNLSModel model. The different backends are all subtypes of ADBackend and are respectively used for:
- gradient computation;
- hessian-vector products;
- jacobian-vector products;
- transpose jacobian-vector products;
- jacobian computation;
- hessian computation;
- directional second derivative computation, i.e. gᵀ ∇²cᵢ(x) v.
The default constructors are
ADModelBackend(nvar, f, ncon = 0, c = (args...) -> []; showtime::Bool = false, kwargs...)
ADModelNLSBackend(nvar, F!, nequ, ncon = 0, c = (args...) -> []; showtime::Bool = false, kwargs...)
If show_time is set to true, it prints the time used to generate each backend. The remaining kwargs are either the different backends, as listed below, or arguments passed to the backends' constructors:
- gradient_backend = ForwardDiffADGradient;
- hprod_backend = ForwardDiffADHvprod;
- jprod_backend = ForwardDiffADJprod;
- jtprod_backend = ForwardDiffADJtprod;
- jacobian_backend = SparseADJacobian;
- hessian_backend = ForwardDiffADHessian;
- ghjvprod_backend = ForwardDiffADGHjvprod;
- hprod_residual_backend = ForwardDiffADHvprod for ADNLSModel and EmptyADbackend otherwise;
- jprod_residual_backend = ForwardDiffADJprod for ADNLSModel and EmptyADbackend otherwise;
- jtprod_residual_backend = ForwardDiffADJtprod for ADNLSModel and EmptyADbackend otherwise;
- jacobian_residual_backend = SparseADJacobian for ADNLSModel and EmptyADbackend otherwise;
- hessian_residual_backend = ForwardDiffADHessian for ADNLSModel and EmptyADbackend otherwise.
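As a sketch of how these kwargs are forwarded, the following overrides a single backend through an ADNLPModel constructor, which builds the ADModelBackend internally; the objective and points below are arbitrary, and NLPModels is assumed available for grad:

```julia
using ADNLPModels, NLPModels

f(x) = sum(x .^ 2)

# Override only the gradient backend; the other backends keep their defaults.
nlp = ADNLPModel(f, ones(2); gradient_backend = ADNLPModels.ForwardDiffADGradient)

grad(nlp, ones(2))  # [2.0, 2.0], computed with the chosen backend
```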
ADNLPModels.ADNLPModel
— Method
ADNLPModel(f, x0)
ADNLPModel(f, x0, lvar, uvar)
ADNLPModel(f, x0, clinrows, clincols, clinvals, lcon, ucon)
ADNLPModel(f, x0, A, lcon, ucon)
ADNLPModel(f, x0, c, lcon, ucon)
ADNLPModel(f, x0, clinrows, clincols, clinvals, c, lcon, ucon)
ADNLPModel(f, x0, A, c, lcon, ucon)
ADNLPModel(f, x0, lvar, uvar, clinrows, clincols, clinvals, lcon, ucon)
ADNLPModel(f, x0, lvar, uvar, A, lcon, ucon)
ADNLPModel(f, x0, lvar, uvar, c, lcon, ucon)
ADNLPModel(f, x0, lvar, uvar, clinrows, clincols, clinvals, c, lcon, ucon)
ADNLPModel(f, x0, lvar, uvar, A, c, lcon, ucon)
ADNLPModel(model::AbstractNLPModel)
ADNLPModel is an AbstractNLPModel using automatic differentiation to compute the derivatives. The problem is defined as
min f(x)
s.to lcon ≤ ( Ax ) ≤ ucon
( c(x) )
lvar ≤ x ≤ uvar.
The following keyword arguments are available to all constructors:
- minimize: A boolean indicating whether this is a minimization problem (default: true)
- name: The name of the model (default: "Generic")
The following keyword arguments are available to the constructors for constrained problems:
- y0: An initial estimate of the Lagrange multipliers (default: zeros)
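For illustration, a minimal sketch combining these keyword arguments on a constrained problem; the objective, constraint, and values are arbitrary:

```julia
using ADNLPModels

f(x) = (x[1] - 1)^2 + (x[2] - 2)^2
c(x) = [x[1] + x[2]]
ncon = 1

# name, minimize, and y0 are the keyword arguments described above
nlp = ADNLPModel(f, zeros(2), c, zeros(ncon), ones(ncon);
                 name = "demo", minimize = true, y0 = [0.5])

nlp.meta.name  # "demo"
```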
ADNLPModel uses ForwardDiff and ReverseDiff for the automatic differentiation. One can specify a new backend with the keyword argument backend::ADNLPModels.ADBackend. There are three pre-coded backends:
- ForwardDiffAD, the default;
- ReverseDiffAD;
- ZygoteDiffAD, accessible after loading Zygote.jl in your environment.
For advanced usage, one can define a custom backend and redefine the API as done in ADNLPModels.jl/src/forward.jl.
Examples
using ADNLPModels
f(x) = sum(x)
x0 = ones(3)
nvar = 3
ADNLPModel(f, x0) # uses the default ForwardDiffAD backend.
ADNLPModel(f, x0; backend = ADNLPModels.ReverseDiffAD) # uses ReverseDiffAD backend.
using Zygote
ADNLPModel(f, x0; backend = ADNLPModels.ZygoteAD)
using ADNLPModels
f(x) = sum(x)
x0 = ones(3)
c(x) = [1x[1] + x[2]; x[2]]
nvar, ncon = 3, 2
ADNLPModel(f, x0, c, zeros(ncon), zeros(ncon)) # uses the default ForwardDiffAD backend.
ADNLPModel(f, x0, c, zeros(ncon), zeros(ncon); backend = ADNLPModels.ReverseDiffAD) # uses ReverseDiffAD backend.
using Zygote
ADNLPModel(f, x0, c, zeros(ncon), zeros(ncon); backend = ADNLPModels.ZygoteAD)
For an in-place constraint function, use one of the following constructors:
ADNLPModel!(f, x0, c!, lcon, ucon)
ADNLPModel!(f, x0, clinrows, clincols, clinvals, c!, lcon, ucon)
ADNLPModel!(f, x0, A, c!, lcon, ucon)
ADNLPModel!(f, x0, lvar, uvar, c!, lcon, ucon)
ADNLPModel!(f, x0, lvar, uvar, clinrows, clincols, clinvals, c!, lcon, ucon)
ADNLPModel!(f, x0, lvar, uvar, A, c!, lcon, ucon)
ADNLPModel!(model::AbstractNLPModel)
where the constraint function has the signature c!(output, input).
using ADNLPModels
f(x) = sum(x)
x0 = ones(3)
function c!(output, x)
output[1] = 1x[1] + x[2]
output[2] = x[2]
end
nvar, ncon = 3, 2
nlp = ADNLPModel!(f, x0, c!, zeros(ncon), zeros(ncon)) # uses the default ForwardDiffAD backend.
ADNLPModels.ADNLSModel
— Method
ADNLSModel(F, x0, nequ)
ADNLSModel(F, x0, nequ, lvar, uvar)
ADNLSModel(F, x0, nequ, clinrows, clincols, clinvals, lcon, ucon)
ADNLSModel(F, x0, nequ, A, lcon, ucon)
ADNLSModel(F, x0, nequ, c, lcon, ucon)
ADNLSModel(F, x0, nequ, clinrows, clincols, clinvals, c, lcon, ucon)
ADNLSModel(F, x0, nequ, A, c, lcon, ucon)
ADNLSModel(F, x0, nequ, lvar, uvar, clinrows, clincols, clinvals, lcon, ucon)
ADNLSModel(F, x0, nequ, lvar, uvar, A, lcon, ucon)
ADNLSModel(F, x0, nequ, lvar, uvar, c, lcon, ucon)
ADNLSModel(F, x0, nequ, lvar, uvar, clinrows, clincols, clinvals, c, lcon, ucon)
ADNLSModel(F, x0, nequ, lvar, uvar, A, c, lcon, ucon)
ADNLSModel(model::AbstractNLSModel)
ADNLSModel is a Nonlinear Least Squares model using automatic differentiation to compute the derivatives. The problem is defined as
min ½‖F(x)‖²
s.to lcon ≤ ( Ax ) ≤ ucon
( c(x) )
lvar ≤ x ≤ uvar
where nequ is the size of the vector F(x), and the linear constraints come first.
The following keyword arguments are available to all constructors:
- linequ: An array of indices of the linear equations (default: Int[])
- minimize: A boolean indicating whether this is a minimization problem (default: true)
- name: The name of the model (default: "Generic")
The following keyword arguments are available to the constructors for constrained problems:
- y0: An initial estimate of the Lagrange multipliers (default: zeros)
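A short sketch using linequ to flag which residual equations are linear; the residual function and the index below are this example's assumptions:

```julia
using ADNLPModels

# residual 2 is linear in x, so flag it via linequ
F(x) = [x[1]^2 - 1; x[1] + x[2]]
nequ = 2

nls = ADNLSModel(F, zeros(2), nequ; linequ = [2], name = "demo-nls")
```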
ADNLSModel uses ForwardDiff and ReverseDiff for the automatic differentiation. One can specify a new backend with the keyword argument backend::ADNLPModels.ADBackend. There are three pre-coded backends:
- ForwardDiffAD, the default;
- ReverseDiffAD;
- ZygoteDiffAD, accessible after loading Zygote.jl in your environment.
For advanced usage, one can define a custom backend and redefine the API as done in ADNLPModels.jl/src/forward.jl.
Examples
using ADNLPModels
F(x) = [x[2]; x[1]]
nequ = 2
x0 = ones(3)
nvar = 3
ADNLSModel(F, x0, nequ) # uses the default ForwardDiffAD backend.
ADNLSModel(F, x0, nequ; backend = ADNLPModels.ReverseDiffAD) # uses ReverseDiffAD backend.
using Zygote
ADNLSModel(F, x0, nequ; backend = ADNLPModels.ZygoteAD)
using ADNLPModels
F(x) = [x[2]; x[1]]
nequ = 2
x0 = ones(3)
c(x) = [1x[1] + x[2]; x[2]]
nvar, ncon = 3, 2
ADNLSModel(F, x0, nequ, c, zeros(ncon), zeros(ncon)) # uses the default ForwardDiffAD backend.
ADNLSModel(F, x0, nequ, c, zeros(ncon), zeros(ncon); backend = ADNLPModels.ReverseDiffAD) # uses ReverseDiffAD backend.
using Zygote
ADNLSModel(F, x0, nequ, c, zeros(ncon), zeros(ncon); backend = ADNLPModels.ZygoteAD)
For in-place constraint and residual functions, use one of the following constructors:
ADNLSModel!(F!, x0, nequ)
ADNLSModel!(F!, x0, nequ, lvar, uvar)
ADNLSModel!(F!, x0, nequ, c!, lcon, ucon)
ADNLSModel!(F!, x0, nequ, clinrows, clincols, clinvals, c!, lcon, ucon)
ADNLSModel!(F!, x0, nequ, clinrows, clincols, clinvals, lcon, ucon)
ADNLSModel!(F!, x0, nequ, A, c!, lcon, ucon)
ADNLSModel!(F!, x0, nequ, A, lcon, ucon)
ADNLSModel!(F!, x0, nequ, lvar, uvar, c!, lcon, ucon)
ADNLSModel!(F!, x0, nequ, lvar, uvar, clinrows, clincols, clinvals, c!, lcon, ucon)
ADNLSModel!(F!, x0, nequ, lvar, uvar, clinrows, clincols, clinvals, lcon, ucon)
ADNLSModel!(F!, x0, nequ, lvar, uvar, A, c!, lcon, ucon)
ADNLSModel!(F!, x0, nequ, lvar, uvar, A, lcon, ucon)
ADNLSModel!(model::AbstractNLSModel)
where the constraint function has the signature c!(output, input).
using ADNLPModels
function F!(output, x)
output[1] = x[2]
output[2] = x[1]
end
nequ = 2
x0 = ones(3)
function c!(output, x)
output[1] = 1x[1] + x[2]
output[2] = x[2]
end
nvar, ncon = 3, 2
nls = ADNLSModel!(F!, x0, nequ, c!, zeros(ncon), zeros(ncon))
ADNLPModels.compute_hessian_sparsity
— Method
compute_hessian_sparsity(f, nvar, c!, ncon; detector)
Return a sparse boolean matrix that represents the adjacency matrix of the Hessian of f(x) + λᵀc(x).
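A minimal sketch; the objective and constraint below are arbitrary, and the detector keyword is assumed to have a default when omitted:

```julia
using ADNLPModels

f(x) = x[1]^2 * x[2]
c!(cx, x) = (cx[1] = x[1] * x[2]; cx)

# Sparsity pattern of ∇²(f(x) + λᵀc(x)) for nvar = 2 variables, ncon = 1 constraint
H = ADNLPModels.compute_hessian_sparsity(f, 2, c!, 1)
size(H)  # (2, 2)
```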
ADNLPModels.compute_jacobian_sparsity
— Function
compute_jacobian_sparsity(c, x0; detector)
compute_jacobian_sparsity(c!, cx, x0; detector)
Return a sparse boolean matrix that represents the adjacency matrix of the Jacobian of c(x).
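A minimal sketch of the out-of-place form; the constraint function is arbitrary, and the detector keyword is assumed to have a default when omitted:

```julia
using ADNLPModels

c(x) = [x[1] * x[2]; x[2]]

# Boolean pattern of the 2×2 Jacobian of c around x0 = ones(2)
S = ADNLPModels.compute_jacobian_sparsity(c, ones(2))
size(S)  # (2, 2)
```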
ADNLPModels.get_F
— Method
get_F(nls)
get_F(nls, ::ADBackend)
Return the out-of-place version of nls.F!.
ADNLPModels.get_adbackend
— Method
get_adbackend(nlp)
Returns the value adbackend from nlp.
ADNLPModels.get_c
— Method
get_c(nlp)
get_c(nlp, ::ADBackend)
Return the out-of-place version of nlp.c!.
ADNLPModels.get_default_backend
— Method
get_default_backend(meth::Symbol, backend::Symbol; kwargs...)
get_default_backend(::Val{::Symbol}, backend; kwargs...)
Return a type <:ADBackend that corresponds to the default backend used for the method meth. See keys(ADNLPModels.predefined_backend) for a list of possible backends.
The following keyword arguments are accepted:
- matrix_free::Bool: If true, this returns an EmptyADbackend for methods that handle matrices, e.g. hessian_backend.
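A hedged sketch, assuming :default is among keys(ADNLPModels.predefined_backend):

```julia
using ADNLPModels

# Which backend type handles gradients in the :default preset?
T = ADNLPModels.get_default_backend(:gradient_backend, :default)

# With matrix_free = true, matrix-producing methods get an EmptyADbackend instead
Th = ADNLPModels.get_default_backend(:hessian_backend, :default; matrix_free = true)

T <: ADNLPModels.ADBackend  # true
```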
ADNLPModels.get_lag
— Method
get_lag(nlp, b::ADBackend, obj_weight)
get_lag(nlp, b::ADBackend, obj_weight, y)
Return the Lagrangian function ℓ(x) = obj_weight * f(x) + c(x)ᵀy.
ADNLPModels.get_nln_nnzh
— Method
get_nln_nnzh(::ADBackend, nvar)
get_nln_nnzh(b::ADModelBackend, nvar)
get_nln_nnzh(nlp::AbstractNLPModel, nvar)
For a given ADBackend of a problem with nvar variables, return the number of nonzeros in the lower triangle of the Hessian. If b is the ADModelBackend, then b.hessian_backend is used.
ADNLPModels.get_nln_nnzj
— Method
get_nln_nnzj(::ADBackend, nvar, ncon)
get_nln_nnzj(b::ADModelBackend, nvar, ncon)
get_nln_nnzj(nlp::AbstractNLPModel, nvar, ncon)
For a given ADBackend of a problem with nvar variables and ncon constraints, return the number of nonzeros in the Jacobian of nonlinear constraints. If b is the ADModelBackend, then b.jacobian_backend is used.
ADNLPModels.get_residual_nnzh
— Method
get_residual_nnzh(b::ADModelBackend, nvar)
get_residual_nnzh(nls::AbstractNLSModel, nvar)
Return the number of nonzero elements in the residual Hessians.
ADNLPModels.get_residual_nnzj
— Method
get_residual_nnzj(b::ADModelBackend, nvar, nequ)
get_residual_nnzj(nls::AbstractNLSModel, nvar, nequ)
Return the number of nonzero elements in the residual Jacobians.
ADNLPModels.set_adbackend!
— Method
set_adbackend!(nlp, new_adbackend)
set_adbackend!(nlp; kwargs...)
Replace the current adbackend value of nlp by new_adbackend, or instantiate a new one with kwargs; see ADModelBackend. By default, the setter with kwargs will reuse existing backends.
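A minimal sketch of the kwargs form; the model below is arbitrary, and ForwardDiffADGradient is one of the backends listed for ADModelBackend:

```julia
using ADNLPModels

f(x) = sum(x)
nlp = ADNLPModel(f, ones(3))

# Rebuild only the gradient backend; the other backends are reused.
ADNLPModels.set_adbackend!(nlp; gradient_backend = ADNLPModels.ForwardDiffADGradient)

ADNLPModels.get_adbackend(nlp)  # the updated ADModelBackend
```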