ADNLPModels

This package provides automatic differentiation (AD)-based model implementations that conform to the NLPModels API. The general form of the optimization problem is

\[\begin{aligned} \min \quad & f(x) \\ & c_L \leq c(x) \leq c_U \\ & \ell \leq x \leq u, \end{aligned}\]

Install

ADNLPModels is a Julia Language package. To install it, open Julia's interactive session (the REPL), press the ] key to enter package mode, and run the following command:

pkg> add ADNLPModels

Complementary packages

The functionality of ADNLPModels.jl is extended by other packages that are not loaded automatically. In other words, you may need to load the relevant package separately to access certain features.

using ADNLPModels # load only the default functionalities
using Zygote # load the Zygote backends
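
For example, the Zygote-based backend described in the docstrings below only becomes available once Zygote is loaded. A minimal sketch:

using ADNLPModels, Zygote

f(x) = sum(x .^ 2)
x0 = ones(3)

# Selecting the Zygote-based backend only works once Zygote is loaded.
nlp = ADNLPModel(f, x0; backend = ADNLPModels.ZygoteAD)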

Version compatibility for these extensions is specified in the file test/Project.toml.

[deps]
CUDA = "052768ef-5323-5732-b1bb-66c8b64840ba"
Enzyme = "7da242da-08ed-463a-9acd-ee780be4f1d9"
ForwardDiff = "f6369f11-7733-5829-9624-2563aa707210"
LinearAlgebra = "37e2e46d-f89d-539d-b4ee-838fcccc9c8e"
ManualNLPModels = "30dfa513-9b2f-4fb3-9796-781eabac1617"
NLPModels = "a4795742-8479-5a88-8948-cc11e1c8c1a6"
NLPModelsModifiers = "e01155f1-5c6f-4375-a9d8-616dd036575f"
NLPModelsTest = "7998695d-6960-4d3a-85c4-e1bceb8cd856"
ReverseDiff = "37e2e3b7-166d-5795-8a7a-e32c996b4267"
SparseArrays = "2f01184e-e22b-5df5-ae63-d93ebab69eaf"
SparseMatrixColorings = "0a514795-09f3-496d-8182-132a7b665d35"
Test = "8dfed614-e22c-5e08-85e1-65c5234f0b40"
Zygote = "e88e6eb3-aa80-5325-afca-941959d7151f"

[compat]
CUDA = "4, 5"
Enzyme = "0.10, 0.11, 0.12"
ForwardDiff = "0.10"
ManualNLPModels = "0.1"
NLPModels = "0.21"
NLPModelsModifiers = "0.7"
NLPModelsTest = "0.10"
ReverseDiff = "1"
SparseMatrixColorings = "0.4.0"
Zygote = "0.6"

Usage

This package defines two models: ADNLPModel for general nonlinear optimization, and ADNLSModel for nonlinear least-squares problems.
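
As a quick preview, the same small least-squares problem can be modeled either way; a minimal sketch using only the basic constructors documented below:

using ADNLPModels

F(x) = [x[1] - 1; 10 * (x[2] - x[1]^2)]  # residual of a small Rosenbrock-type problem
x0 = [-1.2; 1.0]

# General nonlinear model: the least-squares objective is written out explicitly.
nlp = ADNLPModel(x -> sum(F(x) .^ 2) / 2, x0)

# Least-squares model: the residual F and its size are passed directly,
# so the structure of the problem is preserved.
nls = ADNLSModel(F, x0, 2)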

ADNLPModels.ADNLPModel (Type)
ADNLPModel(f, x0)
ADNLPModel(f, x0, lvar, uvar)
ADNLPModel(f, x0, clinrows, clincols, clinvals, lcon, ucon)
ADNLPModel(f, x0, A, lcon, ucon)
ADNLPModel(f, x0, c, lcon, ucon)
ADNLPModel(f, x0, clinrows, clincols, clinvals, c, lcon, ucon)
ADNLPModel(f, x0, A, c, lcon, ucon)
ADNLPModel(f, x0, lvar, uvar, clinrows, clincols, clinvals, lcon, ucon)
ADNLPModel(f, x0, lvar, uvar, A, lcon, ucon)
ADNLPModel(f, x0, lvar, uvar, c, lcon, ucon)
ADNLPModel(f, x0, lvar, uvar, clinrows, clincols, clinvals, c, lcon, ucon)
ADNLPModel(f, x0, lvar, uvar, A, c, lcon, ucon)
ADNLPModel(model::AbstractNLPModel)

ADNLPModel is an AbstractNLPModel using automatic differentiation to compute the derivatives. The problem is defined as

 min  f(x)
s.to  lcon ≤ (  Ax  ) ≤ ucon
             ( c(x) )
      lvar ≤   x  ≤ uvar.

The following keyword arguments are available to all constructors:

  • minimize: A boolean indicating whether this is a minimization problem (default: true)
  • name: The name of the model (default: "Generic")

The following keyword arguments are available to the constructors for constrained problems:

  • y0: An initial estimate of the Lagrange multipliers (default: zeros)

ADNLPModel uses ForwardDiff and ReverseDiff for automatic differentiation. One can specify a different backend with the keyword argument backend::ADNLPModels.ADBackend. There are three pre-coded backends:

  • the default ForwardDiffAD.
  • ReverseDiffAD.
  • ZygoteAD, accessible after loading Zygote.jl in your environment.

For advanced usage, one can define a custom backend and redefine the API as done in ADNLPModels.jl/src/forward.jl.

Examples

using ADNLPModels
f(x) = sum(x)
x0 = ones(3)
nvar = 3
ADNLPModel(f, x0) # uses the default ForwardDiffAD backend.
ADNLPModel(f, x0; backend = ADNLPModels.ReverseDiffAD) # uses ReverseDiffAD backend.

using Zygote
ADNLPModel(f, x0; backend = ADNLPModels.ZygoteAD)
using ADNLPModels
f(x) = sum(x)
x0 = ones(3)
c(x) = [1x[1] + x[2]; x[2]]
nvar, ncon = 3, 2
ADNLPModel(f, x0, c, zeros(ncon), zeros(ncon)) # uses the default ForwardDiffAD backend.
ADNLPModel(f, x0, c, zeros(ncon), zeros(ncon); backend = ADNLPModels.ReverseDiffAD) # uses ReverseDiffAD backend.

using Zygote
ADNLPModel(f, x0, c, zeros(ncon), zeros(ncon); backend = ADNLPModels.ZygoteAD)
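
The constructors taking lvar and uvar add bound constraints on the variables. A minimal sketch using the ADNLPModel(f, x0, lvar, uvar) signature listed above:

using ADNLPModels

f(x) = (x[1] - 1)^2 + 4 * (x[2] - x[1]^2)^2
x0 = [0.5; 0.5]
lvar = zeros(2)  # lower bounds on x
uvar = ones(2)   # upper bounds on x

# Bound-constrained problem: min f(x) subject to 0 ≤ x ≤ 1.
nlp = ADNLPModel(f, x0, lvar, uvar)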

For an in-place constraints function, use one of the following constructors:

ADNLPModel!(f, x0, c!, lcon, ucon)
ADNLPModel!(f, x0, clinrows, clincols, clinvals, c!, lcon, ucon)
ADNLPModel!(f, x0, A, c!, lcon, ucon)
ADNLPModel!(f, x0, lvar, uvar, c!, lcon, ucon)
ADNLPModel!(f, x0, lvar, uvar, clinrows, clincols, clinvals, c!, lcon, ucon)
ADNLPModel!(f, x0, lvar, uvar, A, c!, lcon, ucon)
ADNLPModel!(model::AbstractNLPModel)

where the constraint function has the signature c!(output, input).

using ADNLPModels
f(x) = sum(x)
x0 = ones(3)
function c!(output, x) 
  output[1] = 1x[1] + x[2]
  output[2] = x[2]
end
nvar, ncon = 3, 2
nlp = ADNLPModel!(f, x0, c!, zeros(ncon), zeros(ncon)) # uses the default ForwardDiffAD backend.
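Once constructed, an ADNLPModel is evaluated through the standard NLPModels.jl API. A minimal, self-contained sketch (obj, grad, cons, and jac are provided by NLPModels.jl):

using ADNLPModels, NLPModels

f(x) = sum(x)
x0 = ones(3)
c(x) = [x[1] + x[2]; x[2]]
nlp = ADNLPModel(f, x0, c, zeros(2), zeros(2))

obj(nlp, x0)   # objective value f(x0)
grad(nlp, x0)  # gradient of f, computed by the AD backend
cons(nlp, x0)  # constraint values c(x0)
jac(nlp, x0)   # Jacobian of the constraints at x0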
ADNLPModels.ADNLSModel (Type)
ADNLSModel(F, x0, nequ)
ADNLSModel(F, x0, nequ, lvar, uvar)
ADNLSModel(F, x0, nequ, clinrows, clincols, clinvals, lcon, ucon)
ADNLSModel(F, x0, nequ, A, lcon, ucon)
ADNLSModel(F, x0, nequ, c, lcon, ucon)
ADNLSModel(F, x0, nequ, clinrows, clincols, clinvals, c, lcon, ucon)
ADNLSModel(F, x0, nequ, A, c, lcon, ucon)
ADNLSModel(F, x0, nequ, lvar, uvar, clinrows, clincols, clinvals, lcon, ucon)
ADNLSModel(F, x0, nequ, lvar, uvar, A, lcon, ucon)
ADNLSModel(F, x0, nequ, lvar, uvar, c, lcon, ucon)
ADNLSModel(F, x0, nequ, lvar, uvar, clinrows, clincols, clinvals, c, lcon, ucon)
ADNLSModel(F, x0, nequ, lvar, uvar, A, c, lcon, ucon)
ADNLSModel(model::AbstractNLSModel)

ADNLSModel is a nonlinear least-squares model using automatic differentiation to compute the derivatives. The problem is defined as

 min  ½‖F(x)‖²
s.to  lcon ≤ (  Ax  ) ≤ ucon
             ( c(x) )
      lvar ≤   x  ≤ uvar

where nequ is the size of the vector F(x) and the linear constraints come first.

The following keyword arguments are available to all constructors:

  • linequ: An array of indexes of the linear equations (default: Int[])
  • minimize: A boolean indicating whether this is a minimization problem (default: true)
  • name: The name of the model (default: "Generic")

The following keyword arguments are available to the constructors for constrained problems:

  • y0: An initial estimate of the Lagrange multipliers (default: zeros)

ADNLSModel uses ForwardDiff and ReverseDiff for automatic differentiation. One can specify a different backend with the keyword argument backend::ADNLPModels.ADBackend. There are three pre-coded backends:

  • the default ForwardDiffAD.
  • ReverseDiffAD.
  • ZygoteAD, accessible after loading Zygote.jl in your environment.

For advanced usage, one can define a custom backend and redefine the API as done in ADNLPModels.jl/src/forward.jl.

Examples

using ADNLPModels
F(x) = [x[2]; x[1]]
nequ = 2
x0 = ones(3)
nvar = 3
ADNLSModel(F, x0, nequ) # uses the default ForwardDiffAD backend.
ADNLSModel(F, x0, nequ; backend = ADNLPModels.ReverseDiffAD) # uses ReverseDiffAD backend.

using Zygote
ADNLSModel(F, x0, nequ; backend = ADNLPModels.ZygoteAD)
using ADNLPModels
F(x) = [x[2]; x[1]]
nequ = 2
x0 = ones(3)
c(x) = [1x[1] + x[2]; x[2]]
nvar, ncon = 3, 2
ADNLSModel(F, x0, nequ, c, zeros(ncon), zeros(ncon)) # uses the default ForwardDiffAD backend.
ADNLSModel(F, x0, nequ, c, zeros(ncon), zeros(ncon); backend = ADNLPModels.ReverseDiffAD) # uses ReverseDiffAD backend.

using Zygote
ADNLSModel(F, x0, nequ, c, zeros(ncon), zeros(ncon); backend = ADNLPModels.ZygoteAD)
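
The clinrows, clincols, and clinvals arguments describe the linear constraint matrix A by its nonzero coordinates. A minimal sketch, assuming a single linear constraint x[1] + x[2] = 1:

using ADNLPModels

F(x) = [x[2]; x[1]]
nequ = 2
x0 = ones(3)

# A = [1 1 0] stored by its nonzero entries: A[1, 1] = 1 and A[1, 2] = 1.
clinrows = [1; 1]
clincols = [1; 2]
clinvals = [1.0; 1.0]

# One linear constraint: 1 ≤ x[1] + x[2] ≤ 1.
nls = ADNLSModel(F, x0, nequ, clinrows, clincols, clinvals, [1.0], [1.0])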

For in-place constraint and residual functions, use one of the following constructors:

ADNLSModel!(F!, x0, nequ)
ADNLSModel!(F!, x0, nequ, lvar, uvar)
ADNLSModel!(F!, x0, nequ, c!, lcon, ucon)
ADNLSModel!(F!, x0, nequ, clinrows, clincols, clinvals, c!, lcon, ucon)
ADNLSModel!(F!, x0, nequ, clinrows, clincols, clinvals, lcon, ucon)
ADNLSModel!(F!, x0, nequ, A, c!, lcon, ucon)
ADNLSModel!(F!, x0, nequ, A, lcon, ucon)
ADNLSModel!(F!, x0, nequ, lvar, uvar, c!, lcon, ucon)
ADNLSModel!(F!, x0, nequ, lvar, uvar, clinrows, clincols, clinvals, c!, lcon, ucon)
ADNLSModel!(F!, x0, nequ, lvar, uvar, clinrows, clincols, clinvals, lcon, ucon)
ADNLSModel!(F!, x0, nequ, lvar, uvar, A, c!, lcon, ucon)
ADNLSModel!(F!, x0, nequ, lvar, uvar, A, lcon, ucon)
ADNLSModel!(model::AbstractNLSModel)

where the residual function has the signature F!(output, input) and the constraint function has the signature c!(output, input).

using ADNLPModels
function F!(output, x)
  output[1] = x[2]
  output[2] = x[1]
end
nequ = 2
x0 = ones(3)
function c!(output, x) 
  output[1] = 1x[1] + x[2]
  output[2] = x[2]
end
nvar, ncon = 3, 2
nls = ADNLSModel!(F!, x0, nequ, c!, zeros(ncon), zeros(ncon))
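As with ADNLPModel, the resulting model is evaluated through the NLPModels.jl API; in particular, residual and jac_residual give access to F(x) and its Jacobian, and obj returns ½‖F(x)‖². A minimal sketch:

using ADNLPModels, NLPModels

F(x) = [x[2]; x[1]]
x0 = ones(3)
nls = ADNLSModel(F, x0, 2)

residual(nls, x0)      # residual vector F(x0)
jac_residual(nls, x0)  # Jacobian of F at x0
obj(nls, x0)           # objective value ½‖F(x0)‖²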

Check the Tutorial for more details on usage.

License

This content is released under the MPL-2.0 License.

Bug reports and discussions

If you think you found a bug, feel free to open an issue. Focused suggestions and requests can also be opened as issues. Before opening a pull request, please start an issue or a discussion on the topic.

If you want to ask a question not suited for a bug report, feel free to start a discussion here. This forum is for general discussion about this repository and the JuliaSmoothOptimizers organization, so questions about any of our packages are welcome.

Contents