Reference
NLPModelsModifiers.FeasibilityFormNLS — Type

Converts a nonlinear least-squares problem with residual $F(x)$ to a nonlinear optimization problem with constraints $F(x) = r$ and objective $\tfrac{1}{2}\|r\|^2$. In other words, converts
\[\begin{aligned} \min_x \quad & \tfrac{1}{2}\|F(x)\|^2 \\ \mathrm{s.t.} \quad & c_L ≤ c(x) ≤ c_U \\ & ℓ ≤ x ≤ u \end{aligned}\]
to
\[\begin{aligned} \min_{x,r} \quad & \tfrac{1}{2}\|r\|^2 \\ \mathrm{s.t.} \quad & F(x) - r = 0 \\ & c_L ≤ c(x) ≤ c_U \\ & ℓ ≤ x ≤ u \end{aligned}\]
If you would rather work with the first problem, the nls model itself already acts as an NLPModel of that format.
NLPModelsModifiers.FeasibilityFormNLS — Method

FeasibilityFormNLS(nls)

Converts a nonlinear least-squares problem with residual $F(x)$ to a nonlinear optimization problem with constraints $F(x) = r$ and objective $\tfrac{1}{2}\|r\|^2$.
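As a sketch of the transformation above, assuming ADNLPModels is available to build a small residual model (the problem data here is illustrative, not from the package docs):

```julia
using ADNLPModels, NLPModelsModifiers

F(x) = [x[1] - 1; 10 * (x[2] - x[1]^2)]  # Rosenbrock residual
nls = ADNLSModel(F, [-1.2; 1.0], 2)      # 2 residual equations

fnls = FeasibilityFormNLS(nls)
# fnls has 4 variables (x and the new residual variables r)
# and 2 equality constraints F(x) - r = 0.
```

The wrapped model exposes the standard NLPModel API, so any solver for equality-constrained problems can be applied to it.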
NLPModelsModifiers.FeasibilityResidual — Type

A feasibility residual model is created from an NLPModel of the form
\[\begin{aligned} \min_x \quad & f(x) \\ \mathrm{s.t.} \quad & c_L ≤ c(x) ≤ c_U \\ & \ell ≤ x ≤ u, \end{aligned}\]
by creating slack variables $s = c(x)$ and defining an NLS problem from the equality constraints. The resulting problem is a bound-constrained nonlinear least-squares problem with residual function $F(x,s) = c(x) - s$:
\[\begin{aligned} \min_{x,s} \quad & \tfrac{1}{2} \|c(x) - s\|^2 \\ \mathrm{s.t.} \quad & \ell ≤ x ≤ u \\ & c_L ≤ s ≤ c_U. \end{aligned}\]
Notice that this problem is an AbstractNLSModel, so the residual value, Jacobian and Hessian are explicitly defined through the NLS API. The slack variables are created using SlackModel. If $\ell_i = u_i$, no slack variable is created. In particular, if there are only equality constraints of the form $c(x) = 0$, the resulting NLS is simply $\min_x \tfrac{1}{2}\|c(x)\|^2$.
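A hedged sketch of building such a residual model, assuming ADNLPModels for the (illustrative) test problem:

```julia
using ADNLPModels, NLPModels, NLPModelsModifiers

nlp = ADNLPModel(x -> (x[1] - 1)^2, [0.5; 0.5],
                 x -> [x[1]^2 + x[2]^2], [0.0], [1.0])  # 0 ≤ c(x) ≤ 1

fr = FeasibilityResidual(nlp)   # bound-constrained NLS in (x, s)
r = residual(fr, fr.meta.x0)    # evaluates c(x) - s at the starting point
```

Since the range constraint has distinct lower and upper bounds, one slack variable s with 0 ≤ s ≤ 1 is introduced, and the objective of the original model is discarded: only feasibility is measured.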
NLPModelsModifiers.LBFGSModel — Method

Construct a LBFGSModel from another type of model.
NLPModelsModifiers.LSR1Model — Method

Construct a LSR1Model from another type of nlp.
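Both constructors wrap a model so that Hessian queries go through a limited-memory quasi-Newton operator. A minimal sketch, assuming ADNLPModels is available for the test problem:

```julia
using ADNLPModels, NLPModels, NLPModelsModifiers

nlp = ADNLPModel(x -> (x[1] - 1)^2 + 4 * (x[2] - x[1]^2)^2, [-1.2; 1.0])

lbfgs = LBFGSModel(nlp)   # limited-memory BFGS approximation
lsr1  = LSR1Model(nlp)    # limited-memory SR1 approximation

# hess_op returns the current approximation as a linear operator;
# push!(model, s, y) updates it with a step s and gradient difference y.
B = hess_op(lbfgs, lbfgs.meta.x0)
```

Objective and gradient evaluations are forwarded unchanged to the inner model; only the Hessian is replaced by the approximation.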
NLPModelsModifiers.SlackModel — Type

A model whose only inequality constraints are bounds.
Given a model, this type represents a second model in which slack variables are introduced so as to convert linear and nonlinear inequality constraints to equality constraints and bounds. More precisely, if the original model has the form
\[\begin{aligned} \min_x \quad & f(x)\\ \mathrm{s.t.} \quad & c_L ≤ c(x) ≤ c_U,\\ & ℓ ≤ x ≤ u, \end{aligned}\]
the new model appears to the user as
\[\begin{aligned} \min_X \quad & f(X)\\ \mathrm{s.t.} \quad & g(X) = 0,\\ & L ≤ X ≤ U. \end{aligned}\]
The unknowns $X = (x, s)$ contain the original variables and slack variables $s$. The latter are such that the new model has the general form
\[\begin{aligned} \min_x \quad & f(x)\\ \mathrm{s.t.} \quad & c(x) - s = 0,\\ & c_L ≤ s ≤ c_U,\\ & ℓ ≤ x ≤ u. \end{aligned}\]
although no slack variables are introduced for equality constraints.
The slack variables are implicitly ordered as linear and then nonlinear, and as [s(low), s(upp), s(rng)], where low, upp and rng represent the indices of the constraints of the form $c_L ≤ c(x) < ∞$, $-∞ < c(x) ≤ c_U$ and $c_L ≤ c(x) ≤ c_U$, respectively.
NLPModelsModifiers.SlackModel — Method

Construct a SlackModel from another type of model.
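The conversion above can be sketched on a single range constraint, assuming ADNLPModels for the (illustrative) test problem:

```julia
using ADNLPModels, NLPModelsModifiers

nlp = ADNLPModel(x -> x[1]^2 + x[2]^2, [1.0; 1.0],
                 x -> [x[1] + x[2]], [1.0], [2.0])  # 1 ≤ x₁ + x₂ ≤ 2

snlp = SlackModel(nlp)
# snlp has 3 variables (x₁, x₂ and one slack s with 1 ≤ s ≤ 2)
# and one equality constraint c(x) - s = 0.
```

An equality constraint such as x₁ + x₂ = 1 would pass through unchanged, since no slack variable is introduced for fixed constraints.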
NLPModelsModifiers.SlackNLSModel — Type

Like SlackModel, this model converts inequalities into equalities and bounds.
NLPModelsModifiers.DiagonalAndreiModel — Method

DiagonalAndreiModel(nlp; d0 = fill!(S(undef, nlp.meta.nvar), 1.0))

Construct a DiagonalAndreiModel from another type of nlp, in which the Hessian is approximated via a diagonal Andrei quasi-Newton operator. d0 is the initial approximation of the diagonal of the Hessian; by default it is a vector of ones. See the DiagonalAndrei operator documentation.
NLPModelsModifiers.DiagonalPSBModel — Method

DiagonalPSBModel(nlp; d0 = fill!(S(undef, nlp.meta.nvar), 1.0))

Construct a DiagonalPSBModel from another type of nlp, in which the Hessian is approximated via a diagonal PSB quasi-Newton operator. d0 is the initial approximation of the diagonal of the Hessian; by default it is a vector of ones. See the DiagonalPSB operator documentation.
NLPModelsModifiers.SpectralGradientModel — Method

SpectralGradientModel(nlp; σ = 1.0)

Construct a SpectralGradientModel, which approximates the Hessian as σI, from another type of nlp. The keyword argument σ is the initial positive multiple of the identity. See the SpectralGradient operator documentation for more information on the underlying algorithm.
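The three diagonal approximations share the same constructor pattern. A brief sketch, assuming ADNLPModels for the test problem:

```julia
using ADNLPModels, NLPModelsModifiers

nlp = ADNLPModel(x -> (x[1] - 1)^2 + (x[2] + 2)^2, zeros(2))

andrei   = DiagonalAndreiModel(nlp)             # d0 defaults to ones
psb      = DiagonalPSBModel(nlp)
spectral = SpectralGradientModel(nlp; σ = 2.0)  # Hessian ≈ σI
```

Diagonal approximations store only nvar entries, which makes them attractive when memory is tight or when Hessian-vector products must be cheap.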
NLPModelsModifiers.get_relative_indices — Method

get_relative_indices(model)

Return the relative indices of jlow, jupp and jrng within the set of linear and nonlinear indices.
NLPModelsModifiers.get_slack_ind — Method

get_slack_ind(jl, ind)

Return the relative indices of the set of indices jl within the set of indices ind.
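As a hypothetical illustration of the documented behavior (the values below are not from the package docs), "relative indices of jl within ind" means the positions of the elements of jl inside ind, which plain Julia expresses as:

```julia
# The elements 2 and 5 sit at positions 2 and 4 of ind.
jl  = [2, 5]
ind = [1, 2, 3, 5]
findall(in(jl), ind)  # → [2, 4]
```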
NLPModelsModifiers.relative_columns_indices! — Method

relative_columns_indices!(cols, nlp, nj, n, jlow, model_jlow)

The input jlow should contain relative indices within nlp.model.meta.nln. Add to cols[nj .+ (1:length(jlow))] the indices n + jind, where jind are the indices of jlow relative to setdiff(nlp.model.meta.nln, nlp.model.meta.jfix).
This is the 0-allocation version of:
ind = setdiff(nlp.model.meta.nln, nlp.model.meta.jfix)
jlow_nln = get_slack_ind(nlp.model.meta.jlow, ind)
cols[(nj + 1):(nj + lj)] .= (n .+ jlow_nln)