Models

There are currently three models implemented in this package, besides the external ones.

ADNLPModel

ADNLPModel is an AbstractNLPModel using ForwardDiff to compute the derivatives. In this interface, the objective function $f$ and an initial estimate are required. If there are constraints, the function $c:\mathbb{R}^n\rightarrow\mathbb{R}^m$ and the vectors $c_L$ and $c_U$ also need to be passed. Bounds on the variables and an initial estimate of the Lagrange multipliers can also be provided.

ADNLPModel(f, x0; lvar = [-∞,…,-∞], uvar = [∞,…,∞], y0=zeros,
  c = NotImplemented, lcon = [-∞,…,-∞], ucon = [∞,…,∞], name = "Generic")
  • f :: Function - The objective function $f$;

  • x0 :: Vector - The initial point of the problem;

  • lvar :: Vector - $\ell$, the lower bound of the variables;

  • uvar :: Vector - $u$, the upper bound of the variables;

  • c :: Function - The constraints function $c$;

  • y0 :: Vector - The initial value of the Lagrangian estimates;

  • lcon :: Vector - $c_L$, the lower bounds of the constraints function;

  • ucon :: Vector - $c_U$, the upper bounds of the constraints function;

  • name :: String - A name for the model.

The functions are subject to the same restrictions as any function passed to ForwardDiff, summarised here:

  • The function can only be composed of generic Julia functions;

  • The function must accept only one argument;

  • The function must accept any subtype of Vector as its argument;

  • The function should be type-stable.
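As an illustration of these restrictions (the function names here are made up for the example), compare a function that is generic in its element type with one that forces a concrete type and thereby defeats the dual numbers ForwardDiff propagates:

```julia
using NLPModels

# OK: generic in the element type, takes a single Vector-like argument.
f_ok(x) = sum(x.^2)

# Not OK: forcing Float64 discards the dual numbers that ForwardDiff
# threads through the computation to obtain derivatives.
f_bad(x) = sum(Float64[xi^2 for xi in x])

x0 = [1.0; 2.0]
grad(ADNLPModel(f_ok, x0), x0)    # returns [2.0, 4.0]
# grad(ADNLPModel(f_bad, x0), x0) # would throw a conversion error
```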

For constrained problems, the function $c$ is required, and it must return an array even when m = 1; $c_L$ and $c_U$ must also be passed, otherwise the problem is ill-formed. For equality constraints, the corresponding entries of $c_L$ and $c_U$ should be equal.


Example

using NLPModels
f(x) = sum(x.^4)
x = [1.0; 0.5; 0.25; 0.125]
nlp = ADNLPModel(f, x)
grad(nlp, x)
4-element Array{Float64,1}:
 4.0
 0.5
 0.0625
 0.0078125
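The constrained case works the same way. Here is a small sketch (the problem data is chosen arbitrarily for illustration) with a single equality constraint, passing c, lcon and ucon:

```julia
using NLPModels

# minimize x₁² + x₂²  subject to  x₁ + x₂ = 1
f(x) = x[1]^2 + x[2]^2
c(x) = [x[1] + x[2]]   # must return an array, even though m = 1
x0 = [0.5; 0.5]

# Equality constraint: the corresponding entries of lcon and ucon coincide.
nlp = ADNLPModel(f, x0, c=c, lcon=[1.0], ucon=[1.0])

cons(nlp, x0)   # [1.0]
jac(nlp, x0)    # 1×2 Jacobian with entries [1.0 1.0]
```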

MathProgNLPModel

Construct a MathProgNLPModel from a MathProgModel.


Construct a MathProgNLPModel from a JuMP Model.


Example

using NLPModels, MathProgBase, JuMP
m = Model()
@variable(m, x[1:4])
@NLobjective(m, Min, sum(x[i]^4 for i=1:4))
nlp = MathProgNLPModel(m)
x0 = [1.0; 0.5; 0.25; 0.125]
grad(nlp, x0)
4-element Array{Float64,1}:
 4.0
 0.5
 0.0625
 0.0078125
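Once built, the model answers the rest of the NLPModels API as well. A quick sketch on the same JuMP model (rebuilt here so the snippet is self-contained); the expected values follow from $f(x) = \sum_i x_i^4$:

```julia
using NLPModels, MathProgBase, JuMP

m = Model()
@variable(m, x[1:4])
@NLobjective(m, Min, sum(x[i]^4 for i = 1:4))
nlp = MathProgNLPModel(m)

x0 = [1.0; 0.5; 0.25; 0.125]
obj(nlp, x0)    # sum(x0.^4) ≈ 1.06665
hess(nlp, x0)   # lower triangle of ∇²f(x0): diagonal entries 12 * x0.^2
```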

SimpleNLPModel

SimpleNLPModel is an AbstractNLPModel that uses only user-defined functions. In this interface, the objective function $f$ and an initial estimate are required. If the user wants derivatives to be available, they need to be passed; the same goes for the Hessian and Hessian-vector product. For constraints, $c:\mathbb{R}^n\rightarrow\mathbb{R}^m$ and the vectors $c_L$ and $c_U$ also need to be passed. Bounds on the variables and an initial estimate of the Lagrange multipliers can also be provided. The user can also pass the Jacobian and the Lagrangian Hessian and Hessian-vector product.

SimpleNLPModel(f, x0; lvar = [-∞,…,-∞], uvar = [∞,…,∞], y0=zeros,
  lcon = [-∞,…,-∞], ucon = [∞,…,∞], name = "Generic",
  [list of functions])
  • f :: Function - The objective function $f$;

  • x0 :: Vector - The initial point of the problem;

  • lvar :: Vector - $\ell$, the lower bound of the variables;

  • uvar :: Vector - $u$, the upper bound of the variables;

  • y0 :: Vector - The initial value of the Lagrangian estimates;

  • lcon :: Vector - $c_L$, the lower bounds of the constraints function;

  • ucon :: Vector - $c_U$, the upper bounds of the constraints function;

  • name :: String - A name for the model.

Each function passed corresponds directly to an NLPModels API function. You don't have to define more than you need, but calling an undefined function will throw a NotImplementedError. The list is:

  • g and g! - $\nabla f(x)$, the gradient of the objective function; see grad.

    gx = g(x)
    gx = g!(x, gx)

  • H - The lower triangle of the Hessian of the objective function or of the Lagrangian; see hess.

    Hx = H(x; obj_weight=1.0)          # if the problem is unconstrained
    Hx = H(x; obj_weight=1.0, y=zeros) # if the problem is constrained

  • Hcoord - The lower triangle of the Hessian of the objective function or of the Lagrangian, in triplet format; see hess_coord.

    (rows,cols,vals) = Hcoord(x; obj_weight=1.0)          # if the problem is unconstrained
    (rows,cols,vals) = Hcoord(x; obj_weight=1.0, y=zeros) # if the problem is constrained

  • Hp and Hp! - The product of the Hessian of the objective function or of the Lagrangian by a vector; see hprod.

    Hv = Hp(x, v, obj_weight=1.0)               # if the problem is unconstrained
    Hv = Hp!(x, v, Hv, obj_weight=1.0)          # if the problem is unconstrained
    Hv = Hp(x, v, obj_weight=1.0, y=zeros)      # if the problem is constrained
    Hv = Hp!(x, v, Hv, obj_weight=1.0, y=zeros) # if the problem is constrained

  • c and c! - $c(x)$, the constraints function; see cons.

    cx = c(x)
    cx = c!(x, cx)

  • J - $J(x)$, the Jacobian of the constraints; see jac.

    Jx = J(x)

  • Jcoord - $J(x)$, the Jacobian of the constraints, in triplet format; see jac_coord.

    (rows,cols,vals) = Jcoord(x)

  • Jp and Jp! - The Jacobian-vector product; see jprod.

    Jv = Jp(x, v)
    Jv = Jp!(x, v, Jv)

  • Jtp and Jtp! - The Jacobian-transposed-vector product; see jtprod.

    Jtv = Jtp(x, v)
    Jtv = Jtp!(x, v, Jtv)

For constrained problems, the function $c$ is required, and it must return an array even when m = 1; $c_L$ and $c_U$ must also be passed, otherwise the problem is ill-formed. For equality constraints, the corresponding entries of $c_L$ and $c_U$ should be equal.


Example

using NLPModels
f(x) = sum(x.^4)
g(x) = 4*x.^3
x = [1.0; 0.5; 0.25; 0.125]
nlp = SimpleNLPModel(f, x, g=g)
grad(nlp, x)
4-element Array{Float64,1}:
 4.0
 0.5
 0.0625
 0.0078125
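A constrained SimpleNLPModel is built the same way, supplying whichever derivatives are hand-coded. A sketch with an arbitrarily chosen problem, passing the gradient, constraints and Jacobian but no Hessian:

```julia
using NLPModels

# minimize x₁² + x₂²  subject to  x₁x₂ ≥ 1, with hand-coded derivatives.
f(x) = x[1]^2 + x[2]^2
g(x) = 2*x
c(x) = [x[1]*x[2]]   # must return an array, even though m = 1
J(x) = [x[2] x[1]]   # 1×2 Jacobian of c

x0 = [2.0; 1.0]
nlp = SimpleNLPModel(f, x0, g=g, c=c, J=J, lcon=[1.0], ucon=[Inf])

cons(nlp, x0)   # [2.0]
jac(nlp, x0)    # [1.0 2.0]
# hess(nlp, x0) would throw a NotImplementedError, since no H was passed
```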

SlackModel

A model whose only inequality constraints are bounds.

Given a model, this type represents a second model in which slack variables are introduced so as to convert linear and nonlinear inequality constraints to equality constraints and bounds. More precisely, if the original model has the form

\[ \min f(x) \mbox{ s. t. } c_L \leq c(x) \leq c_U \mbox{ and } \ell \leq x \leq u, \]

the new model appears to the user as

\[ \min f(X) \mbox{ s. t. } g(X) = 0 \mbox{ and } L \leq X \leq U. \]

The unknowns $X = (x, s)$ contain the original variables and slack variables $s$. The latter are such that the new model has the general form

\[ \min f(x) \mbox{ s. t. } c(x) - s = 0, c_L \leq s \leq c_U \mbox{ and } \ell \leq x \leq u, \]

although no slack variables are introduced for equality constraints.

The slack variables are implicitly ordered as [s(low), s(upp), s(rng)], where low, upp and rng represent the indices of the constraints of the form $c_L \leq c(x) < \infty$, $-\infty < c(x) \leq c_U$ and $c_L \leq c(x) \leq c_U$, respectively.


Example

using NLPModels
f(x) = x[1]^2 + 4x[2]^2
c(x) = [x[1]*x[2] - 1]
x = [2.0; 2.0]
nlp = ADNLPModel(f, x, c=c, lcon=[0.0])
nlp_slack = SlackModel(nlp)
nlp_slack.meta.lvar
3-element Array{Float64,1}:
 -Inf
 -Inf
    0.0
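Continuing this example, evaluating the slack model shows the converted constraint in action: the new equality constraint is $c(x) - s = 0$, and the slack variable $s$ inherits the bound $0 \leq s$ seen in lvar above. At $x = (2, 2)$ we have $c(x) = [3]$, so the constraint is satisfied exactly when $s = 3$:

```julia
using NLPModels

# Same model as above, rebuilt so the snippet is self-contained.
f(x) = x[1]^2 + 4x[2]^2
c(x) = [x[1]*x[2] - 1]
x = [2.0; 2.0]
nlp = ADNLPModel(f, x, c=c, lcon=[0.0])
nlp_slack = SlackModel(nlp)

# X = (x, s); the slack model's constraint is c(x) - s = 0.
X = [2.0; 2.0; 3.0]
cons(nlp_slack, X)   # [c(x)[1] - 3.0] = [0.0]
```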