Reference
NLPModels.AbstractNLPModel — Type
AbstractNLPModel
Base type for an optimization model.
NLPModels.AbstractNLPModelMeta — Type
AbstractNLPModelMeta
Base type for metadata related to an optimization model.
NLPModels.AbstractNLSModel — Type
AbstractNLSModel <: AbstractNLPModel
Base type for a nonlinear least-squares model.
NLPModels.Counters — Type
Counters
Struct for storing the number of function evaluations.
Counters()Creates an empty Counters struct.
NLPModels.DimensionError — Type
DimensionError <: Exception
DimensionError(name, dim_expected, dim_found)Error for unexpected dimension. Output: "DimensionError: Input name should have length dim_expected not dim_found"
NLPModels.NLPModelMeta — Type
NLPModelMeta <: AbstractNLPModelMeta
A composite type that represents the main features of the optimization problem

optimize    obj(x)
subject to  lvar ≤ x ≤ uvar
            lcon ≤ cons(x) ≤ ucon

where x is an nvar-dimensional vector, obj is the real-valued objective function, cons is the vector-valued constraint function, and optimize is either "minimize" or "maximize".
Here, lvar, uvar, lcon and ucon are vectors. Some of their components may be infinite to indicate that the corresponding bound or general constraint is not present.
NLPModelMeta(nvar::Integer; kwargs...)
NLPModelMeta(meta::AbstractNLPModelMeta; kwargs...)Create an NLPModelMeta with nvar variables. Alternatively, create an NLPModelMeta copy from another AbstractNLPModelMeta. The following keyword arguments are accepted:
x0: initial guess
lvar: vector of lower bounds
uvar: vector of upper bounds
nlvb: number of nonlinear variables in both objectives and constraints
nlvo: number of nonlinear variables in objectives (includes nlvb)
nlvc: number of nonlinear variables in constraints (includes nlvb)
ncon: number of general constraints
y0: initial Lagrange multipliers
lcon: vector of constraint lower bounds
ucon: vector of constraint upper bounds
nnzo: number of nonzeros in the gradient
nnzj: number of elements needed to store the nonzeros in the sparse Jacobian
lin_nnzj: number of elements needed to store the nonzeros in the sparse Jacobian of linear constraints
nln_nnzj: number of elements needed to store the nonzeros in the sparse Jacobian of nonlinear constraints
nnzh: number of elements needed to store the nonzeros in the sparse Hessian
lin: indices of linear constraints
minimize: true if optimize == minimize
islp: true if the problem is a linear program
name: problem name
NLPModelMeta also contains the following attributes, which are computed from the variables above:
nvar: number of variables
ifix: indices of fixed variables
ilow: indices of variables with lower bound only
iupp: indices of variables with upper bound only
irng: indices of variables with lower and upper bound (range)
ifree: indices of free variables
iinf: indices of visibly infeasible bounds
jfix: indices of equality constraints
jlow: indices of constraints of the form c(x) ≥ cl
jupp: indices of constraints of the form c(x) ≤ cu
jrng: indices of constraints of the form cl ≤ c(x) ≤ cu
jfree: indices of "free" constraints (there shouldn't be any)
jinf: indices of the visibly infeasible constraints
nlin: number of linear constraints
nnln: number of nonlinear general constraints
nln: indices of nonlinear constraints
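For illustration only (not part of the package docstrings), the following sketch builds an NLPModelMeta for a small hypothetical problem with two bounded variables and one linear constraint, then queries a few accessors; all field values are assumptions about this toy problem.

```julia
using NLPModels

# Hypothetical metadata: minimize f(x) over x ∈ R², with x ≥ 0 and one linear
# constraint x₁ + x₂ ≥ 1 (constraint 1 is declared linear via `lin`).
meta = NLPModelMeta(
  2;                       # nvar
  x0 = zeros(2),
  lvar = zeros(2),
  uvar = fill(Inf, 2),
  ncon = 1,
  lcon = [1.0],
  ucon = [Inf],
  lin = [1],
  name = "toy-problem",
)

get_nvar(meta)    # 2
get_ncon(meta)    # 1
get_nlin(meta)    # 1
has_bounds(meta)  # true
```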
NLPModels.NLSCounters — Type
NLSCounters
Struct for storing the number of function evaluations for nonlinear least-squares models. NLSCounters also stores a Counters instance named counters.
NLSCounters()Creates an empty NLSCounters struct.
NLPModels.NLSMeta — Type
NLSMeta
Base type for metadata related to a nonlinear least-squares model.
NLSMeta(nequ, nvar; kwargs...)Create an NLSMeta with nequ equations and nvar variables. The following keyword arguments are accepted:
x0: initial guess
nnzj: number of elements needed to store the nonzeros of the Jacobian of the residual
nnzh: number of elements needed to store the nonzeros of the sum of Hessians of the residuals
lin: indices of linear residuals
NLSMeta also contains the following attributes, which are computed from the variables above:
nequ: size of the residual
nvar: number of variables
nln: indices of nonlinear residuals
nnln: number of nonlinear residuals
nlin: number of linear residuals
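As a hedged illustration (not from the docstrings), a minimal NLSMeta for a hypothetical residual F: R² → R³ could be built as follows.

```julia
using NLPModels

# Metadata for a residual with 3 equations and 2 variables.
nls_meta = NLSMeta(3, 2; x0 = [-1.0, 1.0])

get_nequ(nls_meta)  # 3
get_nvar(nls_meta)  # 2
```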
Base.eltype — Methodeltype(nlp::AbstractNLPModel{T, S})Element type of nlp.meta.x0.
LinearOperators.reset! — Methodreset!(nlp)Reset evaluation count and model data (if appropriate) in nlp.
LinearOperators.reset! — Methodreset!(counters)Reset evaluation counters.
NLPModels.bound_constrained — Methodbound_constrained(nlp)
bound_constrained(meta)Returns whether the problem has bounds on the variables and no other constraints.
NLPModels.cons! — Methodc = cons!(nlp, x, c)Evaluate $c(x)$, the constraints at x in place.
NLPModels.cons — Methodc = cons(nlp, x)Evaluate $c(x)$, the constraints at x.
NLPModels.cons_lin! — Functionc = cons_lin!(nlp, x, c)Evaluate the linear constraints at x in place.
NLPModels.cons_lin — Methodc = cons_lin(nlp, x)Evaluate the linear constraints at x.
NLPModels.cons_nln! — Functionc = cons_nln!(nlp, x, c)Evaluate the nonlinear constraints at x in place.
NLPModels.cons_nln — Methodc = cons_nln(nlp, x)Evaluate the nonlinear constraints at x.
NLPModels.conscale — Functionconscale(model::AbstractNLPModel)Return a vector of constraint scaling factors for the model. These are typically used to normalize constraints to have similar magnitudes and improve convergence behavior in nonlinear solvers.
NLPModels.coo_prod! — Methodcoo_prod!(rows, cols, vals, v, Av)Compute the product of a matrix A given by (rows, cols, vals) and the vector v. The result is stored in Av, which should have length equal to the number of rows of A.
NLPModels.coo_sym_prod! — Methodcoo_sym_prod!(rows, cols, vals, v, Av)Compute the product of a symmetric matrix A given by (rows, cols, vals) and the vector v. The result is stored in Av, which should have length equal to the number of rows of A. Only one triangle of A should be passed.
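A small self-contained sketch of both products on explicit triplets (for illustration only; the matrices are made up):

```julia
using NLPModels

# A = [1 2; 0 3] in COO (triplet) format.
rows, cols, vals = [1, 1, 2], [1, 2, 2], [1.0, 2.0, 3.0]
v  = [1.0, 1.0]
Av = zeros(2)                                  # length = number of rows of A
NLPModels.coo_prod!(rows, cols, vals, v, Av)   # Av == [3.0, 3.0]

# Symmetric B = [1 2; 2 3]; only the lower triangle is stored.
srows, scols, svals = [1, 2, 2], [1, 1, 2], [1.0, 2.0, 3.0]
Bv = zeros(2)
NLPModels.coo_sym_prod!(srows, scols, svals, v, Bv)  # Bv == [3.0, 5.0]
```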
NLPModels.decrement! — Methoddecrement!(nlp, s)Decrement counter s of problem nlp.
NLPModels.equality_constrained — Methodequality_constrained(nlp)
equality_constrained(meta)Returns whether the problem's constraints are all equalities. Unconstrained problems return false.
NLPModels.get_ifix — Methodget_ifix(nlp)
get_ifix(meta)Return the value ifix from meta or nlp.meta.
NLPModels.get_ifree — Methodget_ifree(nlp)
get_ifree(meta)Return the value ifree from meta or nlp.meta.
NLPModels.get_iinf — Methodget_iinf(nlp)
get_iinf(meta)Return the value iinf from meta or nlp.meta.
NLPModels.get_ilow — Methodget_ilow(nlp)
get_ilow(meta)Return the value ilow from meta or nlp.meta.
NLPModels.get_irng — Methodget_irng(nlp)
get_irng(meta)Return the value irng from meta or nlp.meta.
NLPModels.get_islp — Methodget_islp(nlp)
get_islp(meta)Return the value islp from meta or nlp.meta.
NLPModels.get_iupp — Methodget_iupp(nlp)
get_iupp(meta)Return the value iupp from meta or nlp.meta.
NLPModels.get_jfix — Methodget_jfix(nlp)
get_jfix(meta)Return the value jfix from meta or nlp.meta.
NLPModels.get_jfree — Methodget_jfree(nlp)
get_jfree(meta)Return the value jfree from meta or nlp.meta.
NLPModels.get_jinf — Methodget_jinf(nlp)
get_jinf(meta)Return the value jinf from meta or nlp.meta.
NLPModels.get_jlow — Methodget_jlow(nlp)
get_jlow(meta)Return the value jlow from meta or nlp.meta.
NLPModels.get_jrng — Methodget_jrng(nlp)
get_jrng(meta)Return the value jrng from meta or nlp.meta.
NLPModels.get_jupp — Methodget_jupp(nlp)
get_jupp(meta)Return the value jupp from meta or nlp.meta.
NLPModels.get_lcon — Methodget_lcon(nlp)
get_lcon(meta)Return the value lcon from meta or nlp.meta.
NLPModels.get_lin — Methodget_lin(nlp)
get_lin(meta)Return the value lin from meta or nlp.meta.
NLPModels.get_lin — Methodget_lin(nls)
get_lin(nls_meta)Return the value lin from nls_meta or nls.nls_meta.
NLPModels.get_lin_nnzj — Methodget_lin_nnzj(nlp)
get_lin_nnzj(meta)Return the value lin_nnzj from meta or nlp.meta.
NLPModels.get_lvar — Methodget_lvar(nlp)
get_lvar(meta)Return the value lvar from meta or nlp.meta.
NLPModels.get_minimize — Methodget_minimize(nlp)
get_minimize(meta)Return the value minimize from meta or nlp.meta.
NLPModels.get_name — Methodget_name(nlp)
get_name(meta)Return the value name from meta or nlp.meta.
NLPModels.get_ncon — Methodget_ncon(nlp)
get_ncon(meta)Return the value ncon from meta or nlp.meta.
NLPModels.get_nequ — Methodget_nequ(nls)
get_nequ(nls_meta)Return the value nequ from nls_meta or nls.nls_meta.
NLPModels.get_nlin — Methodget_nlin(nlp)
get_nlin(meta)Return the value nlin from meta or nlp.meta.
NLPModels.get_nlin — Methodget_nlin(nls)
get_nlin(nls_meta)Return the value nlin from nls_meta or nls.nls_meta.
NLPModels.get_nln — Methodget_nln(nlp)
get_nln(meta)Return the value nln from meta or nlp.meta.
NLPModels.get_nln — Methodget_nln(nls)
get_nln(nls_meta)Return the value nln from nls_meta or nls.nls_meta.
NLPModels.get_nln_nnzj — Methodget_nln_nnzj(nlp)
get_nln_nnzj(meta)Return the value nln_nnzj from meta or nlp.meta.
NLPModels.get_nlvb — Methodget_nlvb(nlp)
get_nlvb(meta)Return the value nlvb from meta or nlp.meta.
NLPModels.get_nlvc — Methodget_nlvc(nlp)
get_nlvc(meta)Return the value nlvc from meta or nlp.meta.
NLPModels.get_nlvo — Methodget_nlvo(nlp)
get_nlvo(meta)Return the value nlvo from meta or nlp.meta.
NLPModels.get_nnln — Methodget_nnln(nlp)
get_nnln(meta)Return the value nnln from meta or nlp.meta.
NLPModels.get_nnln — Methodget_nnln(nls)
get_nnln(nls_meta)Return the value nnln from nls_meta or nls.nls_meta.
NLPModels.get_nnzh — Methodget_nnzh(nlp)
get_nnzh(meta)Return the value nnzh from meta or nlp.meta.
NLPModels.get_nnzh — Methodget_nnzh(nls)
get_nnzh(nls_meta)Return the value nnzh from nls_meta or nls.nls_meta.
NLPModels.get_nnzj — Methodget_nnzj(nlp)
get_nnzj(meta)Return the value nnzj from meta or nlp.meta.
NLPModels.get_nnzj — Methodget_nnzj(nls)
get_nnzj(nls_meta)Return the value nnzj from nls_meta or nls.nls_meta.
NLPModels.get_nnzo — Methodget_nnzo(nlp)
get_nnzo(meta)Return the value nnzo from meta or nlp.meta.
NLPModels.get_nvar — Methodget_nvar(nlp)
get_nvar(meta)Return the value nvar from meta or nlp.meta.
NLPModels.get_nvar — Methodget_nvar(nls)
get_nvar(nls_meta)Return the value nvar from nls_meta or nls.nls_meta.
NLPModels.get_ucon — Methodget_ucon(nlp)
get_ucon(meta)Return the value ucon from meta or nlp.meta.
NLPModels.get_uvar — Methodget_uvar(nlp)
get_uvar(meta)Return the value uvar from meta or nlp.meta.
NLPModels.get_x0 — Methodget_x0(nlp)
get_x0(meta)Return the value x0 from meta or nlp.meta.
NLPModels.get_x0 — Methodget_x0(nls)
get_x0(nls_meta)Return the value x0 from nls_meta or nls.nls_meta.
NLPModels.get_y0 — Methodget_y0(nlp)
get_y0(meta)Return the value y0 from meta or nlp.meta.
NLPModels.ghjvprod! — Functionghjvprod!(nlp, x, g, v, gHv)Return the vector whose i-th component is gᵀ ∇²cᵢ(x) v in place.
NLPModels.ghjvprod — MethodgHv = ghjvprod(nlp, x, g, v)Return the vector whose i-th component is gᵀ ∇²cᵢ(x) v.
NLPModels.grad! — Functiong = grad!(nlp, x, g)Evaluate $∇f(x)$, the gradient of the objective function at x in place.
NLPModels.grad! — Methodg = grad!(nls, x, g)
g = grad!(nls, x, g, Fx; recompute::Bool=true)Evaluate ∇f(x), the gradient of the objective function of nls::AbstractNLSModel at x in place. Fx is overwritten with the value of the residual F(x). If recompute is true, then Fx is updated with the residual at x.
NLPModels.grad — Methodg = grad(nlp, x)Evaluate $∇f(x)$, the gradient of the objective function at x.
NLPModels.has_bounds — Methodhas_bounds(nlp)
has_bounds(meta)Returns whether the problem has bounds on the variables.
NLPModels.has_equalities — Methodhas_equalities(nlp)Returns whether the problem has constraints and at least one of them is an equality. Unconstrained problems return false.
NLPModels.has_inequalities — Methodhas_inequalities(nlp)Returns whether the problem has constraints and at least one of them is an inequality. Unconstrained problems return false.
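For illustration, a sketch of how a solver might branch on these structure predicates for any model nlp implementing the API (the function name is hypothetical):

```julia
using NLPModels

# Classify a problem using the structure predicates documented above.
function describe_problem(nlp::AbstractNLPModel)
  if unconstrained(nlp)
    return "unconstrained"
  elseif bound_constrained(nlp)
    return "bound constrained only"
  elseif equality_constrained(nlp)
    return "equality constrained"
  elseif has_equalities(nlp) && has_inequalities(nlp)
    return "mixed equalities and inequalities"
  else
    return "inequality constrained"
  end
end
```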
NLPModels.hess — MethodHx = hess(nlp, x, y; obj_weight=1.0)Evaluate the Lagrangian Hessian at (x,y) as a sparse matrix, with objective function scaled by obj_weight, i.e.,
\[∇²L(x,y) = σ ∇²f(x) + \sum_i yᵢ ∇²cᵢ(x),\]
with σ = obj_weight. A Symmetric object wrapping the lower triangle is returned.
NLPModels.hess — MethodHx = hess(nlp, x; obj_weight=1.0)Evaluate the objective Hessian at x as a sparse matrix, with objective function scaled by obj_weight, i.e.,
\[σ ∇²f(x),\]
with σ = obj_weight. A Symmetric object wrapping the lower triangle is returned.
NLPModels.hess_coord! — Functionvals = hess_coord!(nlp, x, y, vals; obj_weight=1.0)Evaluate the Lagrangian Hessian at (x,y) in sparse coordinate format, with objective function scaled by obj_weight, i.e.,
\[∇²L(x,y) = σ ∇²f(x) + \sum_i yᵢ ∇²cᵢ(x),\]
with σ = obj_weight, overwriting vals. Only the lower triangle is returned.
NLPModels.hess_coord! — Methodvals = hess_coord!(nlp, x, vals; obj_weight=1.0)Evaluate the objective Hessian at x in sparse coordinate format, with objective function scaled by obj_weight, i.e.,
\[σ ∇²f(x),\]
with σ = obj_weight, overwriting vals. Only the lower triangle is returned.
NLPModels.hess_coord — Methodvals = hess_coord(nlp, x, y; obj_weight=1.0)Evaluate the Lagrangian Hessian at (x,y) in sparse coordinate format, with objective function scaled by obj_weight, i.e.,
\[∇²L(x,y) = σ ∇²f(x) + \sum_i yᵢ ∇²cᵢ(x),\]
with σ = obj_weight. Only the lower triangle is returned.
NLPModels.hess_coord — Methodvals = hess_coord(nlp, x; obj_weight=1.0)Evaluate the objective Hessian at x in sparse coordinate format, with objective function scaled by obj_weight, i.e.,
\[σ ∇²f(x),\]
with σ = obj_weight. Only the lower triangle is returned.
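For illustration, a hedged sketch that assembles the sparse lower triangle of the Lagrangian Hessian from hess_structure and hess_coord; it assumes nlp is a given constrained AbstractNLPModel and uses its starting point and multipliers.

```julia
using NLPModels, SparseArrays, LinearAlgebra

x = get_x0(nlp)                                  # `nlp` is assumed to be given
y = get_y0(nlp)
rows, cols = hess_structure(nlp)
vals = hess_coord(nlp, x, y; obj_weight = 1.0)   # lower triangle of ∇²L(x, y)
n = get_nvar(nlp)
H = Symmetric(sparse(rows, cols, vals, n, n), :L)  # symmetric view of the full Hessian
```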
NLPModels.hess_coord_residual! — Functionvals = hess_coord_residual!(nls, x, v, vals)Computes the linear combination of the Hessians of the residuals at x with coefficients v in sparse coordinate format, rewriting vals.
NLPModels.hess_coord_residual — Methodvals = hess_coord_residual(nls, x, v)Computes the linear combination of the Hessians of the residuals at x with coefficients v in sparse coordinate format.
NLPModels.hess_op! — MethodH = hess_op!(nlp, x, y, Hv; obj_weight=1.0)Return the Lagrangian Hessian at (x,y) with objective function scaled by obj_weight as a linear operator, storing the result in Hv. The resulting object may be used as if it were a matrix, e.g., w = H * v. The vector Hv is used as preallocated storage for the operation. The linear operator H represents
\[∇²L(x,y) = σ ∇²f(x) + \sum_i yᵢ ∇²cᵢ(x),\]
with σ = obj_weight.
NLPModels.hess_op! — MethodH = hess_op!(nlp, x, Hv; obj_weight=1.0)Return the objective Hessian at x with objective function scaled by obj_weight as a linear operator, storing the result in Hv. The resulting object may be used as if it were a matrix, e.g., w = H * v. The vector Hv is used as preallocated storage for the operation. The linear operator H represents
\[σ ∇²f(x),\]
with σ = obj_weight.
NLPModels.hess_op! — MethodH = hess_op!(nlp, rows, cols, vals, Hv)Return the Hessian given by (rows, cols, vals) as a linear operator, storing the result in Hv. The resulting object may be used as if it were a matrix, e.g., w = H * v. The vector Hv is used as preallocated storage for the operation. The linear operator H represents the Hessian defined by the triplet (rows, cols, vals).
NLPModels.hess_op — MethodH = hess_op(nlp, x, y; obj_weight=1.0)Return the Lagrangian Hessian at (x,y) with objective function scaled by obj_weight as a linear operator. The resulting object may be used as if it were a matrix, e.g., H * v. The linear operator H represents
\[∇²L(x,y) = σ ∇²f(x) + \sum_i yᵢ ∇²cᵢ(x),\]
with σ = obj_weight.
NLPModels.hess_op — MethodH = hess_op(nlp, x; obj_weight=1.0)Return the objective Hessian at x with objective function scaled by obj_weight as a linear operator. The resulting object may be used as if it were a matrix, e.g., H * v. The linear operator H represents
\[σ ∇²f(x),\]
with σ = obj_weight.
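A hedged sketch of matrix-free Hessian products with these operators, assuming nlp is a given AbstractNLPModel:

```julia
using NLPModels

x = get_x0(nlp)
y = get_y0(nlp)
v = ones(get_nvar(nlp))

H  = hess_op(nlp, x, y; obj_weight = 1.0)  # no matrix is formed
Hv = H * v                                 # same result as hprod(nlp, x, y, v)

# Preallocated variant for repeated products inside an iterative method:
buffer = zeros(get_nvar(nlp))
Hop = hess_op!(nlp, x, y, buffer)
Hv2 = Hop * v                              # writes into `buffer`
```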
NLPModels.hess_op_residual! — MethodHop = hess_op_residual!(nls, x, i, Hiv)Computes the Hessian of the i-th residual at x, in linear operator form. The vector Hiv is used as preallocated storage for the operation.
NLPModels.hess_op_residual — MethodHop = hess_op_residual(nls, x, i)Computes the Hessian of the i-th residual at x, in linear operator form.
NLPModels.hess_residual — MethodH = hess_residual(nls, x, v)Computes the linear combination of the Hessians of the residuals at x with coefficients v. A Symmetric object wrapping the lower triangle is returned.
NLPModels.hess_structure! — Functionhess_structure!(nlp, rows, cols)Return the structure of the Lagrangian Hessian in sparse coordinate format in place.
NLPModels.hess_structure — Method(rows,cols) = hess_structure(nlp)Return the structure of the Lagrangian Hessian in sparse coordinate format.
NLPModels.hess_structure_residual! — Functionhess_structure_residual!(nls, rows, cols)Returns the structure of the residual Hessian in place.
NLPModels.hess_structure_residual — Method(rows,cols) = hess_structure_residual(nls)Returns the structure of the residual Hessian.
NLPModels.histline — Methodhistline(s, v, maxv)Return a string of the form
______NAME______: ████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 5
where:
______NAME______ is s with padding to the left and length 16.
The symbols █ and ⋅ fill 20 characters, in the proportion v / maxv for █ and the rest for ⋅.
The number 5 is v.
NLPModels.hprod! — FunctionHv = hprod!(nlp, x, y, v, Hv; obj_weight=1.0)Evaluate the product of the Lagrangian Hessian at (x,y) with the vector v in place, with objective function scaled by obj_weight, where the Lagrangian Hessian is
\[∇²L(x,y) = σ ∇²f(x) + \sum_i yᵢ ∇²cᵢ(x),\]
with σ = obj_weight.
NLPModels.hprod! — MethodHv = hprod!(nlp, rows, cols, vals, v, Hv)Evaluate the product of the objective or Lagrangian Hessian given by (rows, cols, vals) in triplet format with the vector v in place. Only one triangle of the Hessian should be given.
NLPModels.hprod! — MethodHv = hprod!(nlp, x, v, Hv; obj_weight=1.0)Evaluate the product of the objective Hessian at x with the vector v in place, with objective function scaled by obj_weight, where the objective Hessian is
\[σ ∇²f(x),\]
with σ = obj_weight.
NLPModels.hprod — MethodHv = hprod(nlp, x, y, v; obj_weight=1.0)Evaluate the product of the Lagrangian Hessian at (x,y) with the vector v, with objective function scaled by obj_weight, where the Lagrangian Hessian is
\[∇²L(x,y) = σ ∇²f(x) + \sum_i yᵢ ∇²cᵢ(x),\]
with σ = obj_weight.
NLPModels.hprod — MethodHv = hprod(nlp, x, v; obj_weight=1.0)Evaluate the product of the objective Hessian at x with the vector v, with objective function scaled by obj_weight, where the objective Hessian is
\[σ ∇²f(x),\]
with σ = obj_weight.
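The in-place variants follow the same pattern; a hedged sketch, again assuming nlp is a given model:

```julia
using NLPModels

x  = get_x0(nlp)
y  = get_y0(nlp)
v  = ones(get_nvar(nlp))
Hv = similar(x)

hprod!(nlp, x, v, Hv; obj_weight = 1.0)     # Hv = σ ∇²f(x) v
hprod!(nlp, x, y, v, Hv; obj_weight = 1.0)  # Hv = ∇²L(x, y) v
```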
NLPModels.hprod_residual! — FunctionHiv = hprod_residual!(nls, x, i, v, Hiv)Computes the product of the Hessian of the i-th residual at x, times the vector v, and stores it in vector Hiv.
NLPModels.hprod_residual — MethodHiv = hprod_residual(nls, x, i, v)Computes the product of the Hessian of the i-th residual at x, times the vector v.
NLPModels.increment! — Methodincrement!(nlp, s)Increment counter s of problem nlp.
NLPModels.increment! — Methodincrement!(nls, s)Increment counter s of problem nls.
NLPModels.inequality_constrained — Methodinequality_constrained(nlp)
inequality_constrained(meta)Returns whether the problem's constraints are all inequalities. Unconstrained problems return true.
NLPModels.jac — MethodJx = jac(nlp, x)Evaluate $J(x)$, the constraints Jacobian at x as a sparse matrix.
NLPModels.jac_coord! — Methodvals = jac_coord!(nlp, x, vals)Evaluate $J(x)$, the constraints Jacobian at x in sparse coordinate format, rewriting vals.
NLPModels.jac_coord — Methodvals = jac_coord(nlp, x)Evaluate $J(x)$, the constraints Jacobian at x in sparse coordinate format.
NLPModels.jac_coord_residual! — Functionvals = jac_coord_residual!(nls, x, vals)Computes the Jacobian of the residual at x in sparse coordinate format, rewriting vals. rows and cols are not rewritten.
NLPModels.jac_coord_residual — Method(rows,cols,vals) = jac_coord_residual(nls, x)Computes the Jacobian of the residual at x in sparse coordinate format.
NLPModels.jac_lin — MethodJx = jac_lin(nlp)Evaluate the linear constraints Jacobian as a sparse matrix.
NLPModels.jac_lin_coord! — Functionvals = jac_lin_coord!(nlp, vals)Evaluate the linear constraints Jacobian in sparse coordinate format, overwriting vals.
NLPModels.jac_lin_coord — Methodvals = jac_lin_coord(nlp)Evaluate the linear constraints Jacobian in sparse coordinate format.
NLPModels.jac_lin_op! — MethodJ = jac_lin_op!(nlp, Jv, Jtv)Return the linear Jacobian as a linear operator. The resulting object may be used as if it were a matrix, e.g., J * v or J' * v. The values Jv and Jtv are used as preallocated storage for the operations.
NLPModels.jac_lin_op! — MethodJ = jac_lin_op!(nlp, rows, cols, vals, Jv, Jtv)Return the linear Jacobian given by (rows, cols, vals) as a linear operator. The resulting object may be used as if it were a matrix, e.g., J * v or J' * v. The values Jv and Jtv are used as preallocated storage for the operations.
NLPModels.jac_lin_op — MethodJ = jac_lin_op(nlp)Return the linear Jacobian as a linear operator. The resulting object may be used as if it were a matrix, e.g., J * v or J' * v.
NLPModels.jac_lin_structure! — Functionjac_lin_structure!(nlp, rows, cols)Return the structure of the linear constraints Jacobian in sparse coordinate format in place.
NLPModels.jac_lin_structure — Method(rows,cols) = jac_lin_structure(nlp)Return the structure of the linear constraints Jacobian in sparse coordinate format.
NLPModels.jac_nln — MethodJx = jac_nln(nlp, x)Evaluate $J(x)$, the nonlinear constraints Jacobian at x as a sparse matrix.
NLPModels.jac_nln_coord! — Functionvals = jac_nln_coord!(nlp, x, vals)Evaluate $J(x)$, the nonlinear constraints Jacobian at x in sparse coordinate format, overwriting vals.
NLPModels.jac_nln_coord — Methodvals = jac_nln_coord(nlp, x)Evaluate $J(x)$, the nonlinear constraints Jacobian at x in sparse coordinate format.
NLPModels.jac_nln_op! — MethodJ = jac_nln_op!(nlp, rows, cols, vals, Jv, Jtv)Return the nonlinear Jacobian given by (rows, cols, vals) as a linear operator. The resulting object may be used as if it were a matrix, e.g., J * v or J' * v. The values Jv and Jtv are used as preallocated storage for the operations.
NLPModels.jac_nln_op! — MethodJ = jac_nln_op!(nlp, x, Jv, Jtv)Return the nonlinear Jacobian at x as a linear operator. The resulting object may be used as if it were a matrix, e.g., J * v or J' * v. The values Jv and Jtv are used as preallocated storage for the operations.
NLPModels.jac_nln_op — MethodJ = jac_nln_op(nlp, x)Return the nonlinear Jacobian at x as a linear operator. The resulting object may be used as if it were a matrix, e.g., J * v or J' * v.
NLPModels.jac_nln_structure! — Functionjac_nln_structure!(nlp, rows, cols)Return the structure of the nonlinear constraints Jacobian in sparse coordinate format in place.
NLPModels.jac_nln_structure — Method(rows,cols) = jac_nln_structure(nlp)Return the structure of the nonlinear constraints Jacobian in sparse coordinate format.
NLPModels.jac_op! — MethodJ = jac_op!(nlp, rows, cols, vals, Jv, Jtv)Return the Jacobian given by (rows, cols, vals) as a linear operator. The resulting object may be used as if it were a matrix, e.g., J * v or J' * v. The values Jv and Jtv are used as preallocated storage for the operations.
NLPModels.jac_op! — MethodJ = jac_op!(nlp, x, Jv, Jtv)Return the Jacobian at x as a linear operator. The resulting object may be used as if it were a matrix, e.g., J * v or J' * v. The values Jv and Jtv are used as preallocated storage for the operations.
NLPModels.jac_op — MethodJ = jac_op(nlp, x)Return the Jacobian at x as a linear operator. The resulting object may be used as if it were a matrix, e.g., J * v or J' * v.
NLPModels.jac_op_residual! — MethodJx = jac_op_residual!(nls, x, Jv, Jtv)Computes $J(x)$, the Jacobian of the residual at x, in linear operator form. The vectors Jv and Jtv are used as preallocated storage for the operations.
NLPModels.jac_op_residual! — MethodJx = jac_op_residual!(nls, rows, cols, vals, Jv, Jtv)Computes $J(x)$, the Jacobian of the residual given by (rows, cols, vals), in linear operator form. The vectors Jv and Jtv are used as preallocated storage for the operations.
NLPModels.jac_op_residual — MethodJx = jac_op_residual(nls, x)Computes $J(x)$, the Jacobian of the residual at x, in linear operator form.
NLPModels.jac_residual — MethodJx = jac_residual(nls, x)Computes $J(x)$, the Jacobian of the residual at x.
NLPModels.jac_structure! — Methodjac_structure!(nlp, rows, cols)Return the structure of the constraints Jacobian in sparse coordinate format in place.
NLPModels.jac_structure — Method(rows,cols) = jac_structure(nlp)Return the structure of the constraints Jacobian in sparse coordinate format.
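For illustration, a hedged sketch that assembles the sparse Jacobian from its structure and coordinates, together with the matrix-free alternative; nlp is assumed to be a given constrained AbstractNLPModel.

```julia
using NLPModels, SparseArrays

x = get_x0(nlp)
rows, cols = jac_structure(nlp)
vals = jac_coord(nlp, x)
J = sparse(rows, cols, vals, get_ncon(nlp), get_nvar(nlp))

# Matrix-free alternative:
Jop = jac_op(nlp, x)
Jop * ones(get_nvar(nlp))    # same as jprod(nlp, x, v)
Jop' * ones(get_ncon(nlp))   # same as jtprod(nlp, x, w)
```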
NLPModels.jac_structure_residual! — Function(rows,cols) = jac_structure_residual!(nls, rows, cols)Returns the structure of the residual Jacobian in sparse coordinate format in place.
NLPModels.jac_structure_residual — Method(rows,cols) = jac_structure_residual(nls)Returns the structure of the residual Jacobian in sparse coordinate format.
NLPModels.jprod! — MethodJv = jprod!(nlp, x, v, Jv)Evaluate $J(x)v$, the Jacobian-vector product at x in place.
NLPModels.jprod! — MethodJv = jprod!(nlp, rows, cols, vals, v, Jv)Evaluate $J(x)v$, the Jacobian-vector product, where the Jacobian is given by (rows, cols, vals) in triplet format.
NLPModels.jprod — MethodJv = jprod(nlp, x, v)Evaluate $J(x)v$, the Jacobian-vector product at x.
NLPModels.jprod_lin! — FunctionJv = jprod_lin!(nlp, v, Jv)Evaluate $J(x)v$, the linear Jacobian-vector product at x in place.
NLPModels.jprod_lin! — MethodJv = jprod_lin!(nlp, rows, cols, vals, v, Jv)Evaluate $J(x)v$, the linear Jacobian-vector product, where the Jacobian is given by (rows, cols, vals) in triplet format.
NLPModels.jprod_lin — MethodJv = jprod_lin(nlp, v)Evaluate $J(x)v$, the linear Jacobian-vector product at x.
NLPModels.jprod_nln! — FunctionJv = jprod_nln!(nlp, x, v, Jv)Evaluate $J(x)v$, the nonlinear Jacobian-vector product at x in place.
NLPModels.jprod_nln! — MethodJv = jprod_nln!(nlp, rows, cols, vals, v, Jv)Evaluate $J(x)v$, the nonlinear Jacobian-vector product, where the Jacobian is given by (rows, cols, vals) in triplet format.
NLPModels.jprod_nln — MethodJv = jprod_nln(nlp, x, v)Evaluate $J(x)v$, the nonlinear Jacobian-vector product at x.
NLPModels.jprod_residual! — FunctionJv = jprod_residual!(nls, x, v, Jv)Computes the product of the Jacobian of the residual at x and a vector, i.e., $J(x)v$, storing it in Jv.
NLPModels.jprod_residual! — MethodJv = jprod_residual!(nls, rows, cols, vals, v, Jv)Computes the product of the Jacobian of the residual given by (rows, cols, vals) and a vector, i.e., $J(x)v$, storing it in Jv.
NLPModels.jprod_residual — MethodJv = jprod_residual(nls, x, v)Computes the product of the Jacobian of the residual at x and a vector, i.e., $J(x)v$.
NLPModels.jth_hess — MethodHx = jth_hess(nlp, x, j)Evaluate the Hessian of j-th constraint at x as a sparse matrix with the same sparsity pattern as the Lagrangian Hessian. A Symmetric object wrapping the lower triangle is returned.
NLPModels.jth_hess_coord! — Functionvals = jth_hess_coord!(nlp, x, j, vals)Evaluate the Hessian of j-th constraint at x in sparse coordinate format, with vals of length nlp.meta.nnzh, in place. Only the lower triangle is returned.
NLPModels.jth_hess_coord — Methodvals = jth_hess_coord(nlp, x, j)Evaluate the Hessian of j-th constraint at x in sparse coordinate format. Only the lower triangle is returned.
NLPModels.jth_hess_residual — MethodHj = jth_hess_residual(nls, x, j)Computes the Hessian of the j-th residual at x.
NLPModels.jth_hess_residual_coord! — Methodvals = jth_hess_residual_coord!(nls, x, j, vals)Evaluate the Hessian of j-th residual at x in sparse coordinate format, with vals of length nls.nls_meta.nnzh, in place. Only the lower triangle is returned.
NLPModels.jth_hess_residual_coord — Methodvals = jth_hess_residual_coord(nls, x, j)Evaluate the Hessian of j-th residual at x in sparse coordinate format. Only the lower triangle is returned.
NLPModels.jth_hprod! — FunctionHv = jth_hprod!(nlp, x, v, j, Hv)Evaluate the product of the Hessian of j-th constraint at x with the vector v in place.
NLPModels.jth_hprod — MethodHv = jth_hprod(nlp, x, v, j)Evaluate the product of the Hessian of j-th constraint at x with the vector v.
NLPModels.jtprod! — MethodJtv = jtprod!(nlp, x, v, Jtv)Evaluate $J(x)^Tv$, the transposed-Jacobian-vector product at x in place. If the problem has linear and nonlinear constraints, this function allocates.
NLPModels.jtprod! — MethodJtv = jtprod!(nlp, rows, cols, vals, v, Jtv)Evaluate $J(x)^Tv$, the transposed-Jacobian-vector product, where the Jacobian is given by (rows, cols, vals) in triplet format.
NLPModels.jtprod — MethodJtv = jtprod(nlp, x, v)Evaluate $J(x)^Tv$, the transposed-Jacobian-vector product at x.
NLPModels.jtprod_lin! — FunctionJtv = jtprod_lin!(nlp, v, Jtv)Evaluate $J(x)^Tv$, the linear transposed-Jacobian-vector product at x in place.
NLPModels.jtprod_lin! — MethodJtv = jtprod_lin!(nlp, rows, cols, vals, v, Jtv)Evaluate $J(x)^Tv$, the linear transposed-Jacobian-vector product, where the Jacobian is given by (rows, cols, vals) in triplet format.
NLPModels.jtprod_lin — MethodJtv = jtprod_lin(nlp, v)Evaluate $J(x)^Tv$, the linear transposed-Jacobian-vector product at x.
NLPModels.jtprod_nln! — FunctionJtv = jtprod_nln!(nlp, x, v, Jtv)Evaluate $J(x)^Tv$, the nonlinear transposed-Jacobian-vector product at x in place.
NLPModels.jtprod_nln! — MethodJtv = jtprod_nln!(nlp, rows, cols, vals, v, Jtv)Evaluate $J(x)^Tv$, the nonlinear transposed-Jacobian-vector product, where the Jacobian is given by (rows, cols, vals) in triplet format.
NLPModels.jtprod_nln — MethodJtv = jtprod_nln(nlp, x, v)Evaluate $J(x)^Tv$, the nonlinear transposed-Jacobian-vector product at x.
NLPModels.jtprod_residual! — FunctionJtv = jtprod_residual!(nls, x, v, Jtv)Computes the product of the transpose of the Jacobian of the residual at x and a vector, i.e., $J(x)^Tv$, storing it in Jtv.
NLPModels.jtprod_residual! — MethodJtv = jtprod_residual!(nls, rows, cols, vals, v, Jtv)Computes the product of the transpose of the Jacobian of the residual given by (rows, cols, vals) and a vector, i.e., $J(x)^Tv$, storing it in Jtv.
NLPModels.jtprod_residual — MethodJtv = jtprod_residual(nls, x, v)Computes the product of the transpose of the Jacobian of the residual at x and a vector, i.e., $J(x)^Tv$.
NLPModels.lagscale — Functionlagscale(model::AbstractNLPModel)Return a vector of scaling factors for the Lagrange multipliers associated with constraints. This can be used to improve numerical stability or condition number when solving KKT systems.
NLPModels.linearly_constrained — Methodlinearly_constrained(nlp)
linearly_constrained(meta)Returns whether the problem's constraints are known to be all linear.
NLPModels.lines_of_description — Methodlines_of_description(meta)Describe meta for the show function.
NLPModels.lines_of_description — Methodlines_of_description(nls_meta)Describe nls_meta for the show function.
NLPModels.lines_of_hist — Methodlines_of_hist(S, V)Return a vector of histline(s, v, maxv)s using pairs of s in S and v in V. maxv is given by the maximum of V.
NLPModels.neval_cons — Methodneval_cons(nlp)Get the number of cons evaluations.
NLPModels.neval_cons_lin — Methodneval_cons_lin(nlp)Get the number of cons_lin evaluations.
NLPModels.neval_cons_nln — Methodneval_cons_nln(nlp)Get the number of cons_nln evaluations.
NLPModels.neval_grad — Methodneval_grad(nlp)Get the number of grad evaluations.
NLPModels.neval_hess — Methodneval_hess(nlp)Get the number of hess evaluations.
NLPModels.neval_hess_residual — Methodneval_hess_residual(nlp)Get the number of hess_residual evaluations.
NLPModels.neval_hprod — Methodneval_hprod(nlp)Get the number of hprod evaluations.
NLPModels.neval_hprod_residual — Methodneval_hprod_residual(nlp)Get the number of hprod_residual evaluations.
NLPModels.neval_jac — Methodneval_jac(nlp)Get the number of jac evaluations.
NLPModels.neval_jac_lin — Methodneval_jac_lin(nlp)Get the number of jac_lin evaluations.
NLPModels.neval_jac_nln — Methodneval_jac_nln(nlp)Get the number of jac_nln evaluations.
NLPModels.neval_jac_residual — Methodneval_jac_residual(nlp)Get the number of jac_residual evaluations.
NLPModels.neval_jcon — Methodneval_jcon(nlp)Get the number of jcon evaluations.
NLPModels.neval_jgrad — Methodneval_jgrad(nlp)Get the number of jgrad evaluations.
NLPModels.neval_jhess — Methodneval_jhess(nlp)Get the number of jhess evaluations.
NLPModels.neval_jhess_residual — Methodneval_jhess_residual(nlp)Get the number of jhess_residual evaluations.
NLPModels.neval_jhprod — Methodneval_jhprod(nlp)Get the number of jhprod evaluations.
NLPModels.neval_jprod — Methodneval_jprod(nlp)Get the number of jprod evaluations.
NLPModels.neval_jprod_lin — Methodneval_jprod_lin(nlp)Get the number of jprod_lin evaluations.
NLPModels.neval_jprod_nln — Methodneval_jprod_nln(nlp)Get the number of jprod_nln evaluations.
NLPModels.neval_jprod_residual — Methodneval_jprod_residual(nlp)Get the number of jprod_residual evaluations.
NLPModels.neval_jtprod — Methodneval_jtprod(nlp)Get the number of jtprod evaluations.
NLPModels.neval_jtprod_lin — Methodneval_jtprod_lin(nlp)Get the number of jtprod_lin evaluations.
NLPModels.neval_jtprod_nln — Methodneval_jtprod_nln(nlp)Get the number of jtprod_nln evaluations.
NLPModels.neval_jtprod_residual — Methodneval_jtprod_residual(nlp)Get the number of jtprod_residual evaluations.
NLPModels.neval_obj — Methodneval_obj(nlp)Get the number of obj evaluations.
NLPModels.neval_residual — Methodneval_residual(nlp)Get the number of residual evaluations.
NLPModels.nls_meta — Methodnls_meta(nls)Returns the nls_meta structure of nls. Use this instead of nls.nls_meta to handle models that have internal models.
For basic models nls_meta(nls) is defined as nls.nls_meta, but composite models might not keep nls_meta themselves, so they might specialize it to something like nls.internal.nls_meta.
NLPModels.obj — Functionf = obj(nlp, x)Evaluate $f(x)$, the objective function of nlp at x.
NLPModels.obj — Methodf = obj(nls, x)
f = obj(nls, x, Fx; recompute::Bool=true)Evaluate f(x), the objective function of nls::AbstractNLSModel. Fx is overwritten with the value of the residual F(x). If recompute is true, then Fx is updated with the residual at x.
NLPModels.objcons! — Methodf, c = objcons!(nlp, x, c)Evaluate $f(x)$ and $c(x)$ at x. c is overwritten with the value of $c(x)$.
NLPModels.objcons! — Methodf, c = objcons!(nls, x, c)
f, c = objcons!(nls, x, c, Fx; recompute::Bool=true)In-place evaluation of constraints and objective for AbstractNLSModel. Fx is overwritten with the value of the residual F(x). If recompute is true, then Fx is updated with the residual at x.
NLPModels.objcons — Methodf, c = objcons(nlp, x)Evaluate $f(x)$ and $c(x)$ at x.
NLPModels.objgrad! — Methodf, g = objgrad!(nlp, x, g)Evaluate $f(x)$ and $∇f(x)$ at x. g is overwritten with the value of $∇f(x)$.
NLPModels.objgrad! — Methodf, g = objgrad!(nls, x, g)
f, g = objgrad!(nls, x, g, Fx; recompute::Bool=true)Evaluate f(x) and ∇f(x) of nls::AbstractNLSModel at x. Fx is overwritten with the value of the residual F(x). If recompute is true, then Fx is updated with the residual at x.
NLPModels.objgrad — Methodf, g = objgrad(nlp, x)Evaluate $f(x)$ and $∇f(x)$ at x.
NLPModels.reset_data! — Methodreset_data!(nlp)Reset model data if appropriate. This method should be overloaded if a subtype of AbstractNLPModel contains data that should be reset, such as a quasi-Newton linear operator.
NLPModels.residual! — FunctionFx = residual!(nls, x, Fx)Computes $F(x)$, the residual at x.
NLPModels.residual — MethodFx = residual(nls, x)Computes $F(x)$, the residual at x.
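A hedged sketch tying the residual API together for a given nls::AbstractNLSModel, whose objective is f(x) = ½‖F(x)‖²:

```julia
using NLPModels

x  = get_x0(nls)               # `nls` is assumed to be given
Fx = residual(nls, x)          # F(x)
Jx = jac_op_residual(nls, x)   # J(x) as a linear operator
g  = Jx' * Fx                  # ∇f(x) = J(x)ᵀ F(x), same as grad(nls, x)
```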
NLPModels.show_counters — Methodshow_counters(io, counters, fields)Show the fields of the struct counters.
NLPModels.show_header — Methodshow_header(io, nlp)Show a header for the specific nlp type. Should be imported and defined for every model implementing the NLPModels API.
NLPModels.sparsityline — Methodsparsityline(s, v, maxv)Return a string of the form
______NAME______: ( 80.00% sparsity) 5
where:
______NAME______ is s with padding to the left and length 16.
The sparsity value is given by v / maxv.
The number 5 is v.
NLPModels.sum_counters — Methodsum_counters(nlp)Sum all counters of problem nlp except cons, jac, jprod and jtprod.
NLPModels.sum_counters — Methodsum_counters(counters)Sum all counters of counters except cons, jac, jprod and jtprod.
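A hedged sketch of working with counters for a given model nlp (the counts assume the model increments its counters, as the API prescribes):

```julia
using NLPModels

x = get_x0(nlp)
obj(nlp, x)
grad(nlp, x)

neval_obj(nlp)     # 1
neval_grad(nlp)    # 1
sum_counters(nlp)  # total evaluations, excluding cons, jac, jprod and jtprod
reset!(nlp)        # all counters back to zero
```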
NLPModels.unconstrained — Methodunconstrained(nlp)
unconstrained(meta)Returns whether the problem is unconstrained.
NLPModels.varscale — Functionvarscale(model::AbstractNLPModel)Return a vector containing the scaling factors for each variable in the model. This is typically used to normalize variables for numerical stability in solvers.
The scaling is model-dependent. If the model does not override this function, a vector of ones is returned by default. Inspired by the AMPL scaling conventions.
NLPModels.@default_counters — Macro@default_counters Model innerDefine functions relating counters of Model to counters of Model.inner.
NLPModels.@default_nlscounters — Macro@default_nlscounters Model innerDefine functions relating NLS counters of Model to NLS counters of Model.inner.
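For illustration, a hedged sketch of a hypothetical wrapper model that forwards its counters to an inner model via @default_counters; the wrapper type and its fields are assumptions.

```julia
using NLPModels

# Hypothetical wrapper that delegates evaluations to an inner model.
struct LoggedNLPModel{T, S, M <: AbstractNLPModel{T, S}} <: AbstractNLPModel{T, S}
  meta::NLPModelMeta{T, S}
  inner::M
end

# Forward every neval_* query of LoggedNLPModel to its `inner` field.
NLPModels.@default_counters LoggedNLPModel inner
```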
NLPModels.@lencheck — Macro@lencheck n x y z …Check that arrays x, y, z, etc. have a prescribed length n.
NLPModels.@rangecheck — Macro@rangecheck ℓ u i j k …Check that values i, j, k, etc. are in the range [ℓ,u].
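A hedged sketch of how these checks are typically used when implementing the API for a custom model; MyModel and its objective are made up for this example.

```julia
using NLPModels

# Minimal hypothetical model: f(x) = ‖x‖², unconstrained.
struct MyModel{T, S} <: AbstractNLPModel{T, S}
  meta::NLPModelMeta{T, S}
  counters::Counters
end
MyModel(n::Int) = MyModel(NLPModelMeta(n; name = "sumsq"), Counters())

function NLPModels.grad!(nlp::MyModel, x::AbstractVector, g::AbstractVector)
  NLPModels.@lencheck nlp.meta.nvar x g   # throws DimensionError on a length mismatch
  NLPModels.increment!(nlp, :neval_grad)
  g .= 2 .* x                             # ∇f(x) = 2x
  return g
end

nlp = MyModel(3)
grad!(nlp, ones(3), zeros(3))    # [2.0, 2.0, 2.0]
# grad!(nlp, ones(2), zeros(3))  # would throw DimensionError
```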