RegularizedOptimization.jl

This package implements a family of algorithms to solve nonsmooth optimization problems of the form

\[\underset{x \in \mathbb{R}^n}{\text{minimize}} \quad f(x) + h(x), \quad \text{subject to} \ c(x) = 0,\]

where $f: \mathbb{R}^n \to \mathbb{R}$ and $c: \mathbb{R}^n \to \mathbb{R}^m$ are continuously differentiable, and $h: \mathbb{R}^n \to \mathbb{R} \cup \{+\infty\}$ is lower semi-continuous. The nonsmooth term $h$ can be a regularizer, such as a sparsity-inducing penalty, it can model simple constraints, such as membership of $x$ in a convex set (through an indicator function), or it can combine both. All of $f$, $h$ and $c$ may be nonconvex.
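
For example, a classical instance of this problem class is sparse least-squares (LASSO-type) regression, without equality constraints, where the smooth term is a quadratic misfit and the regularizer promotes sparsity:

\[f(x) = \tfrac{1}{2} \|Ax - b\|_2^2, \qquad h(x) = \lambda \|x\|_1, \qquad \lambda > 0.\]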

All solvers implemented in this package are JuliaSmoothOptimizers-compliant: they take a RegularizedNLPModel as input and return a GenericExecutionStats. A minimal usage sketch appears after the list below.

A RegularizedNLPModel contains:

  • a smooth component f represented as an AbstractNLPModel,
  • a nonsmooth regularizer h.
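
To make this concrete, here is a minimal sketch of how the two components might be assembled and passed to a solver. It is illustrative only: the RegularizedNLPModel constructor (assumed here to be provided by RegularizedProblems.jl), the solver name R2 and its single-argument signature are assumptions to be adapted to the installed version of the package.

using ADNLPModels, ProximalOperators
using RegularizedProblems, RegularizedOptimization

# Smooth component f: a small least-squares objective, modeled with automatic differentiation
f = ADNLPModel(x -> 0.5 * sum((x .- 1).^2), zeros(5))

# Nonsmooth component h: an ℓ1 regularizer with weight 0.1
h = NormL1(0.1)

# Bundle both into a RegularizedNLPModel and call a solver (R2 and its signature are assumed)
reg_nlp = RegularizedNLPModel(f, h)
stats = R2(reg_nlp)

# stats is a GenericExecutionStats; its fields include status, solution and objective
println(stats.status)
println(stats.solution)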

For the smooth component f, we refer to jso.dev for tutorials on the NLPModels API. This framework allows models from a variety of sources to be used, ranging from models built with automatic differentiation to interfaces to external modeling environments and problem collections.

Users interested in defining their own model can refer to ManualNLPModels.jl.
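
Whichever backend defines the smooth model, the solvers interact with it only through the NLPModels API. As a brief illustration (with ADNLPModels.jl supplying derivatives by automatic differentiation), the standard accessors obj and grad evaluate the objective and its gradient:

using ADNLPModels, NLPModels

# Rosenbrock-type smooth model with starting point x0 = zeros(2)
f = ADNLPModel(x -> (x[1] - 1)^2 + 100 * (x[2] - x[1]^2)^2, zeros(2))

x = [0.5, 0.25]
fx = obj(f, x)    # objective value f(x)
gx = grad(f, x)   # gradient ∇f(x), computed by automatic differentiation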

For the nonsmooth component h, we refer to ShiftedProximalOperators.jl. Ready-to-use regularizers, such as the ℓ0 and ℓ1 norms, are available in ProximalOperators.jl.
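
As a small example on the regularizer side, an ℓ1 term can be created, evaluated, and its proximal mapping computed with ProximalOperators.jl; ShiftedProximalOperators.jl provides shifted variants of such regularizers.

using ProximalOperators

h = NormL1(0.5)            # h(x) = 0.5 ‖x‖₁
x = [1.0, -0.2, 0.0, 3.0]

hx = h(x)                  # evaluate the regularizer at x
y, hy = prox(h, x, 1.0)    # proximal point of h at x with step size 1.0, and h(y)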


How to Install

RegularizedOptimization can be installed through the Julia package manager:

julia> ]
pkg> add https://github.com/JuliaSmoothOptimizers/RegularizedOptimization.jl
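
Once installed, the package is loaded like any other Julia package:

julia> using RegularizedOptimization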

Bug reports and discussions

If you think you found a bug, please open an issue. Focused suggestions and feature requests can also be opened as issues. Before opening a pull request, we recommend starting with an issue or a discussion.

For general questions that do not fit a bug report, feel free to start a discussion on the JuliaSmoothOptimizers discussion forum. This forum is for questions and discussions about any of the JuliaSmoothOptimizers packages.