Reference


ADNLPModels.ADModelBackendType
ADModelBackend(gradient_backend, hprod_backend, jprod_backend, jtprod_backend, jacobian_backend, hessian_backend, ghjvprod_backend, hprod_residual_backend, jprod_residual_backend, jtprod_residual_backend, jacobian_residual_backend, hessian_residual_backend)

Structure that defines the different backends used to compute automatic differentiation of an ADNLPModel/ADNLSModel model. The backends are all subtypes of ADBackend and are respectively used for:

  • gradient computation;
  • Hessian-vector products;
  • Jacobian-vector products;
  • transposed Jacobian-vector products;
  • Jacobian computation;
  • Hessian computation;
  • directional second derivative computation, i.e. gᵀ ∇²cᵢ(x) v.

The default constructors are

ADModelBackend(nvar, f, ncon = 0, c = (args...) -> []; showtime::Bool = false, kwargs...)
ADModelNLSBackend(nvar, F!, nequ, ncon = 0, c = (args...) -> []; showtime::Bool = false, kwargs...)

If showtime is set to true, the time used to generate each backend is printed.

The remaining kwargs are either the different backends listed below or arguments passed to the backends' constructors:

  • gradient_backend = ForwardDiffADGradient;
  • hprod_backend = ForwardDiffADHvprod;
  • jprod_backend = ForwardDiffADJprod;
  • jtprod_backend = ForwardDiffADJtprod;
  • jacobian_backend = SparseADJacobian;
  • hessian_backend = ForwardDiffADHessian;
  • ghjvprod_backend = ForwardDiffADGHjvprod;
  • hprod_residual_backend = ForwardDiffADHvprod for ADNLSModel and EmptyADbackend otherwise;
  • jprod_residual_backend = ForwardDiffADJprod for ADNLSModel and EmptyADbackend otherwise;
  • jtprod_residual_backend = ForwardDiffADJtprod for ADNLSModel and EmptyADbackend otherwise;
  • jacobian_residual_backend = SparseADJacobian for ADNLSModel and EmptyADbackend otherwise;
  • hessian_residual_backend = ForwardDiffADHessian for ADNLSModel and EmptyADbackend otherwise.
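
The backend structure can also be built directly. A minimal sketch, assuming the ADNLPModels package is installed and using the constructor and keyword arguments documented above:

```julia
using ADNLPModels

f(x) = sum(x .^ 2)
nvar = 2

# Build the backend structure for an unconstrained problem,
# overriding only the gradient backend; the others keep their defaults.
backends = ADNLPModels.ADModelBackend(
  nvar,
  f;
  gradient_backend = ADNLPModels.ReverseDiffADGradient,
)
```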
source
ADNLPModels.ADNLPModelMethod
ADNLPModel(f, x0)
ADNLPModel(f, x0, lvar, uvar)
ADNLPModel(f, x0, clinrows, clincols, clinvals, lcon, ucon)
ADNLPModel(f, x0, A, lcon, ucon)
ADNLPModel(f, x0, c, lcon, ucon)
ADNLPModel(f, x0, clinrows, clincols, clinvals, c, lcon, ucon)
ADNLPModel(f, x0, A, c, lcon, ucon)
ADNLPModel(f, x0, lvar, uvar, clinrows, clincols, clinvals, lcon, ucon)
ADNLPModel(f, x0, lvar, uvar, A, lcon, ucon)
ADNLPModel(f, x0, lvar, uvar, c, lcon, ucon)
ADNLPModel(f, x0, lvar, uvar, clinrows, clincols, clinvals, c, lcon, ucon)
ADNLPModel(f, x0, lvar, uvar, A, c, lcon, ucon)
ADNLPModel(model::AbstractNLPModel)

ADNLPModel is an AbstractNLPModel using automatic differentiation to compute the derivatives. The problem is defined as

 min  f(x)
s.to  lcon ≤ (  Ax  ) ≤ ucon
             ( c(x) )
      lvar ≤   x  ≤ uvar.

The following keyword arguments are available to all constructors:

  • minimize: A boolean indicating whether this is a minimization problem (default: true)
  • name: The name of the model (default: "Generic")

The following keyword arguments are available to the constructors for constrained problems:

  • y0: An initial estimate of the Lagrange multipliers (default: zeros)
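
These keyword arguments can be sketched as follows (assuming the ADNLPModels package is installed; the problem below is illustrative):

```julia
using ADNLPModels

f(x) = sum(x .^ 2)
x0 = ones(3)
c(x) = [x[1] + x[2]]

# A maximization problem with a custom name and an initial multiplier estimate.
nlp = ADNLPModel(f, x0, c, [0.0], [1.0];
                 minimize = false, name = "toy", y0 = ones(1))
```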

ADNLPModel uses ForwardDiff and ReverseDiff for the automatic differentiation. One can specify a new backend with the keyword argument backend::ADNLPModels.ADBackend. There are three pre-coded backends:

  • the default ForwardDiffAD.
  • ReverseDiffAD.
  • ZygoteDiffAD accessible after loading Zygote.jl in your environment.

For advanced usage, one can define a custom backend and redefine the API as done in ADNLPModels.jl/src/forward.jl.

Examples

using ADNLPModels
f(x) = sum(x)
x0 = ones(3)
nvar = 3
ADNLPModel(f, x0) # uses the default ForwardDiffAD backend.
ADNLPModel(f, x0; backend = ADNLPModels.ReverseDiffAD) # uses ReverseDiffAD backend.

using Zygote
ADNLPModel(f, x0; backend = ADNLPModels.ZygoteAD)
using ADNLPModels
f(x) = sum(x)
x0 = ones(3)
c(x) = [1x[1] + x[2]; x[2]]
nvar, ncon = 3, 2
ADNLPModel(f, x0, c, zeros(ncon), zeros(ncon)) # uses the default ForwardDiffAD backend.
ADNLPModel(f, x0, c, zeros(ncon), zeros(ncon); backend = ADNLPModels.ReverseDiffAD) # uses ReverseDiffAD backend.

using Zygote
ADNLPModel(f, x0, c, zeros(ncon), zeros(ncon); backend = ADNLPModels.ZygoteAD)

For an in-place constraint function, use one of the following constructors:

ADNLPModel!(f, x0, c!, lcon, ucon)
ADNLPModel!(f, x0, clinrows, clincols, clinvals, c!, lcon, ucon)
ADNLPModel!(f, x0, A, c!, lcon, ucon)
ADNLPModel!(f, x0, lvar, uvar, c!, lcon, ucon)
ADNLPModel!(f, x0, lvar, uvar, clinrows, clincols, clinvals, c!, lcon, ucon)
ADNLPModel!(f, x0, lvar, uvar, A, c!, lcon, ucon)
ADNLPModel!(model::AbstractNLPModel)

where the constraint function has the signature c!(output, input).

using ADNLPModels
f(x) = sum(x)
x0 = ones(3)
function c!(output, x) 
  output[1] = 1x[1] + x[2]
  output[2] = x[2]
end
nvar, ncon = 3, 2
nlp = ADNLPModel!(f, x0, c!, zeros(ncon), zeros(ncon)) # uses the default ForwardDiffAD backend.
source
ADNLPModels.ADNLSModelMethod
ADNLSModel(F, x0, nequ)
ADNLSModel(F, x0, nequ, lvar, uvar)
ADNLSModel(F, x0, nequ, clinrows, clincols, clinvals, lcon, ucon)
ADNLSModel(F, x0, nequ, A, lcon, ucon)
ADNLSModel(F, x0, nequ, c, lcon, ucon)
ADNLSModel(F, x0, nequ, clinrows, clincols, clinvals, c, lcon, ucon)
ADNLSModel(F, x0, nequ, A, c, lcon, ucon)
ADNLSModel(F, x0, nequ, lvar, uvar, clinrows, clincols, clinvals, lcon, ucon)
ADNLSModel(F, x0, nequ, lvar, uvar, A, lcon, ucon)
ADNLSModel(F, x0, nequ, lvar, uvar, c, lcon, ucon)
ADNLSModel(F, x0, nequ, lvar, uvar, clinrows, clincols, clinvals, c, lcon, ucon)
ADNLSModel(F, x0, nequ, lvar, uvar, A, c, lcon, ucon)
ADNLSModel(model::AbstractNLSModel)

ADNLSModel is a nonlinear least squares model using automatic differentiation to compute the derivatives. The problem is defined as

 min  ½‖F(x)‖²
s.to  lcon ≤ (  Ax  ) ≤ ucon
             ( c(x) )
      lvar ≤   x  ≤ uvar

where nequ is the size of the vector F(x) and the linear constraints come first.
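
Since the linear constraints come first, lcon and ucon stack the linear bounds before the nonlinear ones. A sketch (assuming the ADNLPModels package is installed) with one linear constraint given in coordinate format followed by one nonlinear constraint:

```julia
using ADNLPModels

F(x) = [x[1] - 1; x[2] - x[1]^2]
x0 = ones(2)
nequ = 2

# One linear constraint x₁ ≥ 0, given in coordinate format
# (row 1, column 1, coefficient 1.0), then one nonlinear constraint 0 ≤ x₁x₂ ≤ 1.
c(x) = [x[1] * x[2]]
lcon = [0.0; 0.0]  # [linear; nonlinear]
ucon = [Inf; 1.0]
nls = ADNLSModel(F, x0, nequ, [1], [1], [1.0], c, lcon, ucon)
```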

The following keyword arguments are available to all constructors:

  • linequ: An array of indexes of the linear equations (default: Int[])
  • minimize: A boolean indicating whether this is a minimization problem (default: true)
  • name: The name of the model (default: "Generic")

The following keyword arguments are available to the constructors for constrained problems:

  • y0: An initial estimate of the Lagrange multipliers (default: zeros)

ADNLSModel uses ForwardDiff and ReverseDiff for the automatic differentiation. One can specify a new backend with the keyword argument backend::ADNLPModels.ADBackend. There are three pre-coded backends:

  • the default ForwardDiffAD.
  • ReverseDiffAD.
  • ZygoteDiffAD accessible after loading Zygote.jl in your environment.

For advanced usage, one can define a custom backend and redefine the API as done in ADNLPModels.jl/src/forward.jl.

Examples

using ADNLPModels
F(x) = [x[2]; x[1]]
nequ = 2
x0 = ones(3)
nvar = 3
ADNLSModel(F, x0, nequ) # uses the default ForwardDiffAD backend.
ADNLSModel(F, x0, nequ; backend = ADNLPModels.ReverseDiffAD) # uses ReverseDiffAD backend.

using Zygote
ADNLSModel(F, x0, nequ; backend = ADNLPModels.ZygoteAD)
using ADNLPModels
F(x) = [x[2]; x[1]]
nequ = 2
x0 = ones(3)
c(x) = [1x[1] + x[2]; x[2]]
nvar, ncon = 3, 2
ADNLSModel(F, x0, nequ, c, zeros(ncon), zeros(ncon)) # uses the default ForwardDiffAD backend.
ADNLSModel(F, x0, nequ, c, zeros(ncon), zeros(ncon); backend = ADNLPModels.ReverseDiffAD) # uses ReverseDiffAD backend.

using Zygote
ADNLSModel(F, x0, nequ, c, zeros(ncon), zeros(ncon); backend = ADNLPModels.ZygoteAD)

For in-place constraint and residual functions, use one of the following constructors:

ADNLSModel!(F!, x0, nequ)
ADNLSModel!(F!, x0, nequ, lvar, uvar)
ADNLSModel!(F!, x0, nequ, c!, lcon, ucon)
ADNLSModel!(F!, x0, nequ, clinrows, clincols, clinvals, c!, lcon, ucon)
ADNLSModel!(F!, x0, nequ, clinrows, clincols, clinvals, lcon, ucon)
ADNLSModel!(F!, x0, nequ, A, c!, lcon, ucon)
ADNLSModel!(F!, x0, nequ, A, lcon, ucon)
ADNLSModel!(F!, x0, nequ, lvar, uvar, c!, lcon, ucon)
ADNLSModel!(F!, x0, nequ, lvar, uvar, clinrows, clincols, clinvals, c!, lcon, ucon)
ADNLSModel!(F!, x0, nequ, lvar, uvar, clinrows, clincols, clinvals, lcon, ucon)
ADNLSModel!(F!, x0, nequ, lvar, uvar, A, c!, lcon, ucon)
ADNLSModel!(F!, x0, nequ, lvar, uvar, A, lcon, ucon)
ADNLSModel!(model::AbstractNLSModel)

where the constraint function has the signature c!(output, input).

using ADNLPModels
function F!(output, x)
  output[1] = x[2]
  output[2] = x[1]
end
nequ = 2
x0 = ones(3)
function c!(output, x) 
  output[1] = 1x[1] + x[2]
  output[2] = x[2]
end
nvar, ncon = 3, 2
nls = ADNLSModel!(F!, x0, nequ, c!, zeros(ncon), zeros(ncon))
source
ADNLPModels.compute_jacobian_sparsityFunction
compute_jacobian_sparsity(c, x0; detector)
compute_jacobian_sparsity(c!, cx, x0; detector)

Return a sparse boolean matrix that represents the sparsity pattern of the Jacobian of c(x).
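
A minimal sketch of both signatures, assuming the ADNLPModels package is installed and using the default detector:

```julia
using ADNLPModels

c(x) = [x[1] * x[2]; x[2]]
x0 = ones(2)
S = ADNLPModels.compute_jacobian_sparsity(c, x0)

# In-place variant: cx is a preallocated output vector.
c!(cx, x) = (cx[1] = x[1] * x[2]; cx[2] = x[2]; cx)
cx = zeros(2)
S2 = ADNLPModels.compute_jacobian_sparsity(c!, cx, x0)
```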

source
ADNLPModels.get_default_backendMethod
get_default_backend(meth::Symbol, backend::Symbol; kwargs...)
get_default_backend(::Val{meth}, backend; kwargs...)

Return a type <:ADBackend that corresponds to the default backend used for the method meth. See keys(ADNLPModels.predefined_backend) for a list of possible backends.

The following keyword arguments are accepted:

  • matrix_free::Bool: If true, this returns an EmptyADbackend for methods that handle matrices, e.g. :hessian_backend.
source
ADNLPModels.get_lagMethod
get_lag(nlp, b::ADBackend, obj_weight)
get_lag(nlp, b::ADBackend, obj_weight, y)

Return the Lagrangian function ℓ(x) = obj_weight * f(x) + c(x)ᵀy.

source
ADNLPModels.get_nln_nnzhMethod
get_nln_nnzh(::ADBackend, nvar)
get_nln_nnzh(b::ADModelBackend, nvar)
get_nln_nnzh(nlp::AbstractNLPModel, nvar)

For a given ADBackend of a problem with nvar variables, return the number of nonzeros in the lower triangle of the Hessian. If b is an ADModelBackend, then b.hessian_backend is used.

source
ADNLPModels.get_nln_nnzjMethod
get_nln_nnzj(::ADBackend, nvar, ncon)
get_nln_nnzj(b::ADModelBackend, nvar, ncon)
get_nln_nnzj(nlp::AbstractNLPModel, nvar, ncon)

For a given ADBackend of a problem with nvar variables and ncon constraints, return the number of nonzeros in the Jacobian of the nonlinear constraints. If b is an ADModelBackend, then b.jacobian_backend is used.

source
ADNLPModels.get_residual_nnzhMethod
get_residual_nnzh(b::ADModelBackend, nvar)
get_residual_nnzh(nls::AbstractNLSModel, nvar)

Return the number of nonzero elements in the residual Hessians.

source
ADNLPModels.get_residual_nnzjMethod
get_residual_nnzj(b::ADModelBackend, nvar, nequ)
get_residual_nnzj(nls::AbstractNLSModel, nvar, nequ)

Return the number of nonzero elements in the residual Jacobians.

source
ADNLPModels.get_sparsity_patternMethod
S = get_sparsity_pattern(model::ADModel, derivative::Symbol)

Retrieve the sparsity pattern of a Jacobian or Hessian from an ADModel. For the Hessian, only the lower triangular part of its sparsity pattern is returned. The user can reconstruct the upper triangular part by exploiting symmetry.

To compute the sparsity pattern, the model must use a sparse backend. Supported backends include SparseADJacobian, SparseADHessian, and SparseReverseADHessian.

Input arguments

  • model: An automatic differentiation model (either AbstractADNLPModel or AbstractADNLSModel).
  • derivative: The type of derivative for which the sparsity pattern is needed. The supported values are :jacobian, :hessian, :jacobian_residual and :hessian_residual.

Output argument

  • S: A sparse matrix of type SparseMatrixCSC{Bool,Int} indicating the sparsity pattern of the requested derivative.
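
A minimal sketch, assuming the ADNLPModels package is installed. The default jacobian_backend is already sparse; for the Hessian pattern, a sparse Hessian backend is requested explicitly:

```julia
using ADNLPModels

f(x) = (x[1] - 1)^2 + 100 * (x[2] - x[1]^2)^2
c(x) = [x[1] + x[2]]
nlp = ADNLPModel(f, ones(2), c, [0.0], [0.0];
                 hessian_backend = ADNLPModels.SparseADHessian)

J = get_sparsity_pattern(nlp, :jacobian)  # SparseMatrixCSC{Bool,Int}
H = get_sparsity_pattern(nlp, :hessian)   # lower triangular part only
```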
source
ADNLPModels.set_adbackend!Method
set_adbackend!(nlp, new_adbackend)
set_adbackend!(nlp; kwargs...)

Replace the current adbackend value of nlp with new_adbackend, or instantiate a new one from kwargs; see ADModelBackend. By default, the setter with kwargs reuses the existing backends.
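
Both forms of the setter can be sketched as follows (assuming the ADNLPModels package is installed and the backend types named below):

```julia
using ADNLPModels

f(x) = sum(x .^ 2)
nlp = ADNLPModel(f, ones(2))

# Instantiate new backends from kwargs; unspecified backends are reused.
ADNLPModels.set_adbackend!(nlp; gradient_backend = ADNLPModels.ReverseDiffADGradient)

# Or replace the whole structure with a prebuilt ADModelBackend.
new_b = ADNLPModels.ADModelBackend(nlp.meta.nvar, f)
ADNLPModels.set_adbackend!(nlp, new_b)
```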

source