API

Auxiliary

SolverTools.active (Function)
active(x, ℓ, u; rtol = 1e-8, atol = 1e-8)

Computes the active bounds at x, using tolerance min(rtol * (uᵢ-ℓᵢ), atol). If ℓᵢ or uᵢ is not finite, only atol is used.
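
For illustration, a minimal sketch, under the assumption that active returns the indices of the components of x lying at (or within tolerance of) one of their bounds:

x = [0.0, 0.5, 1.0]
ℓ = zeros(3)
u = ones(3)
indices = active(x, ℓ, u)  # expected: components 1 and 3, since x₁ = ℓ₁ and x₃ = u₃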

SolverTools.breakpoints (Function)
nbrk, brkmin, brkmax = breakpoints(x, d, ℓ, u)

Find the smallest and largest values of α such that x + αd lies on the boundary. x is assumed to be feasible. nbrk is the number of breakpoints from x in the direction d.
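
A small sketch; the values in the comment follow from the definition above and are given for illustration:

x = [0.5, 0.5]     # feasible point
d = [1.0, -1.0]    # search direction
ℓ = zeros(2)
u = ones(2)
nbrk, brkmin, brkmax = breakpoints(x, d, ℓ, u)
# both components reach a bound at α = 0.5, so nbrk = 2 and brkmin = brkmax = 0.5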

SolverTools.log_header (Function)
log_header(colnames, coltypes)

Creates a header using the names in colnames formatted according to the types in coltypes. Uses internal formatting specification given by SolverTools.formats and default header translation given by SolverTools.default_headers.

Input:

  • colnames::Vector{Symbol}: Column names.
  • coltypes::Vector{DataType}: Column types.

Keyword arguments:

  • hdr_override::Dict{Symbol,String}: Overrides the default headers.
  • colsep::Int: Number of spaces between columns (Default: 2)

See also log_row.
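
A short sketch combining log_header and log_row (column names and values are illustrative):

@info log_header([:iter, :fx, :dual], [Int, Float64, Float64], hdr_override=Dict(:fx => "f(x)"))
@info log_row(Any[1, 0.5, 1.0e-3])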

SolverTools.log_row (Function)
log_row(vals)

Creates a table row from the values in vals according to their types. Pass the names and types of vals to log_header to create the matching header of a logging table. Uses the internal formatting specification given by SolverTools.formats.

Keyword arguments:

  • colsep::Int: Number of spaces between columns (Default: 2)

SolverTools.project! (Function)
project!(y, x, ℓ, u)

Projects x onto the bounds given by ℓ and u, in the sense of yᵢ = max(ℓᵢ, min(xᵢ, uᵢ)).
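
A small illustration, following directly from the definition above:

x = [-1.0, 0.5, 2.0]
ℓ = zeros(3)
u = ones(3)
y = similar(x)
project!(y, x, ℓ, u)  # afterwards y == [0.0, 0.5, 1.0]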

Benchmarking

Main Benchmarking Functions

SolverTools.bmark_solvers (Function)
bmark_solvers(solvers :: Dict{Symbol,Any}, args...; kwargs...)

Run a set of solvers on a set of problems.

Arguments

  • solvers: a dictionary of solvers to which each problem should be passed
  • other positional arguments accepted by solve_problems(), except for a solver name

Keyword arguments

Any keyword argument accepted by solve_problems()

Return value

A Dict{Symbol, DataFrame} of statistics, with one DataFrame per solver, as returned by solve_problems().
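
A hedged usage sketch; solver1, solver2 and problems are placeholders for your own solver functions and problem collection:

solvers = Dict{Symbol,Any}(:solver1 => solver1, :solver2 => solver2)
stats = bmark_solvers(solvers, problems)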

SolverTools.solve_problems (Function)
solve_problems(solver, problems :: Any; kwargs...)

Apply a solver to a set of problems.

Arguments

  • solver: the function name of a solver
  • problems: the set of problems to pass to the solver, as an iterable of AbstractNLPModel. It is recommended to use a generator expression (necessary for CUTEst problems).

Keyword arguments

  • solver_logger::AbstractLogger: logger wrapping the solver call. (default: NullLogger).
  • skipif::Function: function applied to each problem that returns whether to skip it (default: x->false)
  • prune: do not include skipped problems in the final statistics (default: true)
  • any other keyword argument to be passed to the solver

Return value

  • a DataFrame where each row is a problem, minus the skipped ones if prune is true.
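
A hedged sketch; mysolver is a placeholder and problems is assumed to be a generator of AbstractNLPModel instances:

stats = solve_problems(mysolver, problems; skipif = prob -> prob.meta.ncon > 0)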

Utilities

SolverTools.save_stats (Function)
save_stats(stats, filename; kwargs...)

Write the benchmark statistics stats to a file named filename.

Arguments

  • stats::Dict{Symbol,DataFrame}: benchmark statistics such as returned by bmark_solvers()
  • filename::AbstractString: the output file name.

Keyword arguments

  • force::Bool=false: whether to overwrite filename if it already exists
  • key::String="stats": the key under which the data can be read from filename later.

Return value

This method throws an error if filename exists and force == false. On success, it returns the value of jldopen(filename, "w").
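
A short sketch (the file name is illustrative):

save_stats(stats, "mystats.jld2", force=true, key="stats")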

SolverTools.load_stats (Function)
stats = load_stats(filename; kwargs...)

Arguments

  • filename::AbstractString: the input file name.

Keyword arguments

  • key::String="stats": the key under which the data can be read in filename. The key should be the same as the one used when save_stats() was called.

Return value

A Dict{Symbol,DataFrame} containing the statistics stored in file filename. The user should import DataFrames before calling load_stats().
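
For instance, following the note above about DataFrames:

using DataFrames
stats = load_stats("mystats.jld2")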

SolverTools.count_unique (Function)
vals = count_unique(X)

Count the number of occurrences of each value in X.

Arguments

  • X: an iterable.

Return value

A Dict{eltype(X),Int} whose keys are the unique elements in X and values are their number of occurrences.

Example: the snippet

stats = load_stats("mystats.jld2")
for solver ∈ keys(stats)
  @info "$solver statuses" count_unique(stats[solver].status)
end

displays the number of occurrences of each final status for each solver in stats.

SolverTools.quick_summary (Function)
statuses, avgs = quick_summary(stats; kwargs...)

Call count_unique() and compute a few average measures for each solver in stats.

Arguments

  • stats::Dict{Symbol,DataFrame}: benchmark statistics such as returned by bmark_solvers().

Keyword arguments

  • cols::Vector{Symbol}: symbols indicating DataFrame columns in solver statistics for which we compute averages. Default: [:iter, :neval_obj, :neval_grad, :neval_hess, :neval_hprod, :elapsed_time].

Return value

  • statuses::Dict{Symbol,Dict{Symbol,Int}}: a dictionary of number of occurrences of each final status for each solver in stats. Each value in this dictionary is returned by count_unique()
  • avgs::Dict{Symbol,Dict{Symbol,Float64}}: a dictionary that contains averages of performance measures across all problems for each solver. Each avgs[solver] is a Dict{Symbol,Float64} where the measures are those given in the keyword argument cols and values are averages of those measures across all problems.

Example: the snippet

statuses, avgs = quick_summary(stats)
for solver ∈ keys(stats)
  @info "statistics for" solver statuses[solver] avgs[solver]
end

displays quick summary and averages for each solver.

Line-Search

SolverTools.LineModel (Type)

A type to represent the restriction of a function to a direction. Given f : Rⁿ → R, x ∈ Rⁿ and a nonzero direction d ∈ Rⁿ,

ϕ = LineModel(nlp, x, d)

represents the function ϕ : R → R defined by

ϕ(t) := f(x + td).
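
A hedged sketch, assuming nlp is an AbstractNLPModel and x, d are vectors of matching dimension, using the functions documented below:

ϕ = LineModel(nlp, x, d)
ϕ₀ = obj(ϕ, 0.0)      # f(x)
slope = grad(ϕ, 0.0)  # ∇f(x)ᵀd
curv = hess(ϕ, 0.0)   # dᵀ∇²f(x)d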

NLPModels.obj (Function)

obj(f, t) evaluates the objective of the LineModel

ϕ(t) := f(x + td).

NLPModels.grad (Function)

grad(f, t) evaluates the first derivative of the LineModel

ϕ(t) := f(x + td),

i.e.,

ϕ'(t) = ∇f(x + td)ᵀd.

NLPModels.grad! (Function)

grad!(f, t, g) evaluates the first derivative of the LineModel

ϕ(t) := f(x + td),

i.e.,

ϕ'(t) = ∇f(x + td)ᵀd.

The gradient ∇f(x + td) is stored in g.

NLPModels.hess (Function)

Evaluate the second derivative of the LineModel

ϕ(t) := f(x + td),

i.e.,

ϕ"(t) = dᵀ∇²f(x + td)d.

SolverTools.armijo_wolfe (Function)
t, good_grad, ht, nbk, nbW = armijo_wolfe(h, h₀, slope, g)

Performs a line search from x along the direction d as defined by the LineModel $h(t) = f(x + t d)$, where h₀ = h(0) = f(x), slope = h'(0) = ∇f(x)ᵀd, and g is a vector that will be overwritten with the gradient at various points. On exit, if good_grad=true, g contains the gradient at the final step length. The step length is chosen in an attempt to satisfy both the Armijo and Wolfe conditions. The Armijo condition is

\[h(t) ≤ h₀ + τ₀ t h'(0)\]

and the Wolfe condition is

\[h'(t) ≤ τ₁ h'(0).\]

Initially, the step is increased in an attempt to satisfy the Wolfe condition. Afterwards, only backtracking is performed in order to satisfy the Armijo condition. The final step length may satisfy only the Armijo condition.

The output is the following:

  • t: the step length;
  • good_grad: whether g is the gradient at x + t * d;
  • ht: the model value at t, i.e., f(x + t * d);
  • nbk: the number of times the steplength was decreased to satisfy the Armijo condition, i.e., number of backtracks;
  • nbW: the number of times the steplength was increased to satisfy the Wolfe condition.

The following keyword arguments can be provided:

  • t: starting steplength (default 1);
  • τ₀: slope factor in the Armijo condition (default max(1e-4, √ϵₘ));
  • τ₁: slope factor in the Wolfe condition. It should satisfy τ₁ > τ₀ (default 0.9999);
  • bk_max: maximum number of backtracks (default 10);
  • bW_max: maximum number of increases (default 5);
  • verbose: whether to print information (default false).
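
A hedged usage sketch, assuming nlp, x and d are available and d is a descent direction:

ϕ = LineModel(nlp, x, d)
h₀ = obj(ϕ, 0.0)
slope = grad(ϕ, 0.0)
g = similar(x)
t, good_grad, ht, nbk, nbW = armijo_wolfe(ϕ, h₀, slope, g)
x .+= t .* d   # accept the step of length t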

Stats

SolverTools.GenericExecutionStats (Type)
GenericExecutionStats(status, nlp; ...)

A GenericExecutionStats is a struct for storing output information of solvers. It contains the following fields:

  • status: Indicates the output of the solver. Use show_statuses() for the full list;
  • solution: The final approximation returned by the solver (default: []);
  • objective: The objective value at solution (default: Inf);
  • dual_feas: The dual feasibility norm at solution (default: Inf);
  • primal_feas: The primal feasibility norm at solution (default: 0.0 if unconstrained, Inf otherwise);
  • multipliers: The Lagrange multipliers with respect to the constraints (default: []);
  • multipliers_L: The Lagrange multipliers with respect to the lower bounds on the variables (default: []);
  • multipliers_U: The Lagrange multipliers with respect to the upper bounds on the variables (default: []);
  • iter: The number of iterations computed by the solver (default: -1);
  • elapsed_time: The elapsed time computed by the solver (default: Inf);
  • counters::NLPModels.NLSCounters: The internal structure storing the number of function evaluations;
  • solver_specific::Dict{Symbol,Any}: A solver specific dictionary.

The counters variable is a copy of nlp's counters, and status is mandatory on construction. All other variables can be input as keyword arguments.

Notice that GenericExecutionStats does not compute anything; it simply stores the information it is given.
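
A hedged construction sketch; nlp, x and the values shown are placeholders, obj is the NLPModels objective, and :first_order is assumed to be among the statuses reported by show_statuses():

stats = GenericExecutionStats(:first_order, nlp, solution=x, objective=obj(nlp, x),
                              dual_feas=1.0e-6, iter=25, elapsed_time=0.4)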

Trust-Region

SolverTools.AbstractTrustRegion (Type)

AbstractTrustRegion

An abstract trust-region type, so that specific trust-region implementations can define their own update rules. Child types must have at least the following fields:

  • acceptance_threshold :: AbstractFloat
  • initial_radius :: AbstractFloat
  • radius :: AbstractFloat
  • ratio :: AbstractFloat

and the following function:

  • update!(tr, step_norm)
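
A minimal sketch of a child type with the required fields and an update! method; the update rule shown is one simple possibility, not the package's own:

import SolverTools: update!

mutable struct BasicTrustRegion{T <: AbstractFloat} <: SolverTools.AbstractTrustRegion
  acceptance_threshold::T
  initial_radius::T
  radius::T
  ratio::T
end

function update!(tr::BasicTrustRegion, step_norm::AbstractFloat)
  if tr.ratio < tr.acceptance_threshold
    tr.radius /= 2                             # poor agreement: shrink the region
  else
    tr.radius = max(tr.radius, 2 * step_norm)  # good agreement: allow growth
  end
  return tr
end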

SolverTools.aredpred (Function)

ared, pred = aredpred(nlp, f, f_trial, Δm, x_trial, step, slope)

Compute the actual and predicted reductions Δf and Δm, where Δf = f_trial - f is the actual reduction in an objective, merit or penalty function, and Δm = m_trial - m is the reduction predicted by the model m of f. We assume that m is being minimized, and therefore that Δm < 0.
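
A hedged sketch of typical use inside a trust-region iteration; nlp, x, f, step, Δm and slope are placeholders produced by the surrounding algorithm:

x_trial = x .+ step
f_trial = obj(nlp, x_trial)
ared, pred = aredpred(nlp, f, f_trial, Δm, x_trial, step, slope)
ratio = ared / pred   # this ratio is what update! uses to adjust the radius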

SolverTools.get_property (Function)

A basic getter for AbstractTrustRegion instances. Should be overhauled when it's possible to overload getfield() and setfield!(). See https://github.com/JuliaLang/julia/issues/1974

SolverTools.update! (Function)

update!(tr, step_norm)

Update the trust-region radius based on the ratio of actual vs. predicted reduction and the step norm.
