# NLPModels

https://github.com/JuliaSmoothOptimizers/NLPModels.jl
This package provides general guidelines to represent optimization problems in Julia and a standardized API to evaluate the functions and their derivatives.
The main objective is to be able to rely on that API when designing optimization solvers in Julia.
## How to Cite

If you use NLPModels.jl in your work, please cite it using the format given in `CITATION.bib`.
## Optimization Problems

Optimization problems are represented by an instance of (a subtype of) `AbstractNLPModel`.
Such instances are composed of

- an instance of `NLPModelMeta`, which provides information about the problem, including the number of variables, constraints, bounds on the variables, etc.;
- other data specific to the provenance of the problem.

See the documentation for details on the models and the API.
## Installation
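NLPModels.jl is a registered Julia package, so it can be added from the Pkg REPL:

```julia
pkg> add NLPModels
```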
## Models

This package provides no models itself, although it allows the definition of manually written models, as sketched below.
Check the list of packages that define models in the documentation.
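As an illustration, here is a minimal hand-written model, assuming a recent NLPModels version with the parametric `AbstractNLPModel{T, S}` type; the problem (minimize ½‖x‖²) and the name `HalfNormModel` are hypothetical:

```julia
using LinearAlgebra, NLPModels

# Hand-written model for min ½‖x‖²: a subtype of AbstractNLPModel holding
# the mandatory `meta` and `counters` fields.
struct HalfNormModel{T, S} <: AbstractNLPModel{T, S}
  meta::NLPModelMeta{T, S}
  counters::Counters
end

HalfNormModel(n::Int) = HalfNormModel(NLPModelMeta(n, x0 = ones(n)), Counters())

function NLPModels.obj(nlp::HalfNormModel, x::AbstractVector)
  increment!(nlp, :neval_obj)   # keep the evaluation counters up to date
  return dot(x, x) / 2
end

function NLPModels.grad!(nlp::HalfNormModel, x::AbstractVector, g::AbstractVector)
  increment!(nlp, :neval_grad)
  g .= x
  return g
end
```

With these two methods defined, `obj(nlp, x)` and `grad(nlp, x)` (the out-of-place variant falls back on `grad!`) work for `nlp = HalfNormModel(2)`.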
## Main Methods

If `model` is an instance of an appropriate subtype of `AbstractNLPModel`, the following methods are normally defined:

- `obj(model, x)`: evaluate f(x), the objective at x;
- `cons(model, x)`: evaluate c(x), the vector of general constraints at x.
The following methods are defined if first-order derivatives are available:

- `grad(model, x)`: evaluate ∇f(x), the objective gradient at x;
- `jac(model, x)`: evaluate J(x), the Jacobian of c at x as a sparse matrix.
If Jacobian-vector products can be computed more efficiently than by evaluating the Jacobian explicitly, the following methods may be implemented:

- `jprod(model, x, v)`: evaluate the matrix-vector product J(x)⋅v;
- `jtprod(model, x, u)`: evaluate the matrix-vector product J(x)ᵀ⋅u.
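As a sketch of these first-order calls, assuming the separate ADNLPModels.jl package to supply a concrete model (the toy problem below is made up for illustration):

```julia
using ADNLPModels, NLPModels

# Rosenbrock objective with the single general constraint x₁ + x₂ ≤ 1.
f(x) = (x[1] - 1)^2 + 100 * (x[2] - x[1]^2)^2
c(x) = [x[1] + x[2]]
model = ADNLPModel(f, [-1.2; 1.0], c, [-Inf], [1.0])

x = model.meta.x0
fx = obj(model, x)                # objective value f(x)
cx = cons(model, x)               # constraint values c(x)
gx = grad(model, x)               # gradient ∇f(x)
Jx = jac(model, x)                # Jacobian J(x) as a sparse matrix
Jv = jprod(model, x, [1.0; 1.0])  # J(x)⋅v without forming J(x)
Ju = jtprod(model, x, [1.0])      # J(x)ᵀ⋅u
```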
The following method is defined if second-order derivatives are available:

- `hess(model, x, y)`: evaluate ∇²L(x,y), the Hessian of the Lagrangian at x and y.

If Hessian-vector products can be computed more efficiently than by evaluating the Hessian explicitly, the following method may be implemented:

- `hprod(model, x, v, y)`: evaluate the matrix-vector product ∇²L(x,y)⋅v.
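Continuing the sketch above for the second-order calls (the argument order follows the listing in this README and may differ across NLPModels versions, so check the documentation of your installed release):

```julia
y = ones(model.meta.ncon)  # Lagrange multiplier estimates
Hxy = hess(model, x, y)    # ∇²L(x,y), the Hessian of the Lagrangian
# hprod(model, x, v, y) evaluates ∇²L(x,y)⋅v without forming the Hessian.
```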
Several in-place variants of the methods above may also be implemented.
The complete list of methods that an interface may implement can be found in the documentation.
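For instance, a sketch of the in-place gradient, reusing the model above (`grad!` writes into a preallocated vector):

```julia
g = similar(x)
grad!(model, x, g)  # overwrites g with ∇f(x), avoiding an allocation per call
```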
## Attributes

`NLPModelMeta` objects have the following attributes (with `S <: AbstractVector`):

| Attribute | Type | Notes |
|-----------|------|-------|
| `nvar` | `Int` | number of variables |
| `x0` | `S` | initial guess |
| `lvar` | `S` | vector of lower bounds |
| `uvar` | `S` | vector of upper bounds |
| `ifix` | `Vector{Int}` | indices of fixed variables |
| `ilow` | `Vector{Int}` | indices of variables with lower bound only |
| `iupp` | `Vector{Int}` | indices of variables with upper bound only |
| `irng` | `Vector{Int}` | indices of variables with lower and upper bound (range) |
| `ifree` | `Vector{Int}` | indices of free variables |
| `iinf` | `Vector{Int}` | indices of visibly infeasible bounds |
| `ncon` | `Int` | total number of general constraints |
| `nlin` | `Int` | number of linear constraints |
| `nnln` | `Int` | number of nonlinear general constraints |
| `y0` | `S` | initial Lagrange multipliers |
| `lcon` | `S` | vector of constraint lower bounds |
| `ucon` | `S` | vector of constraint upper bounds |
| `lin` | `Vector{Int}` | indices of linear constraints |
| `nln` | `Vector{Int}` | indices of nonlinear constraints |
| `jfix` | `Vector{Int}` | indices of equality constraints |
| `jlow` | `Vector{Int}` | indices of constraints of the form c(x) ≥ cl |
| `jupp` | `Vector{Int}` | indices of constraints of the form c(x) ≤ cu |
| `jrng` | `Vector{Int}` | indices of constraints of the form cl ≤ c(x) ≤ cu |
| `jfree` | `Vector{Int}` | indices of "free" constraints (there shouldn't be any) |
| `jinf` | `Vector{Int}` | indices of visibly infeasible constraints |
| `nnzo` | `Int` | number of nonzeros in the gradient |
| `nnzj` | `Int` | number of nonzeros in the sparse Jacobian |
| `nnzh` | `Int` | number of nonzeros in the sparse Hessian |
| `minimize` | `Bool` | `true` if `optimize == minimize` |
| `islp` | `Bool` | `true` if the problem is a linear program |
| `name` | `String` | problem name |
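These attributes are reached through a model's `meta` field; for the toy model above:

```julia
meta = model.meta  # the NLPModelMeta of the ADNLPModel defined earlier
meta.nvar          # 2: number of variables
meta.ncon          # 1: number of general constraints
meta.x0            # initial guess [-1.2, 1.0]
meta.minimize      # true: this is a minimization problem
```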