pips(f_fcn, x0=None, A=None, l=None, u=None, xmin=None, xmax=None,
     gh_fcn=None, hess_fcn=None, opt=None)
Primal-dual interior point method for NLP (nonlinear programming).
Minimize a function f(x), beginning from a starting point x0, subject to optional linear and nonlinear constraints and variable bounds:
    min f(x)
     x

subject to:

    g(x) = 0             (nonlinear equalities)
    h(x) <= 0            (nonlinear inequalities)
    l <= A*x <= u        (linear constraints)
    xmin <= x <= xmax    (variable bounds)
Note: The calling syntax is almost identical to that of FMINCON from MathWorks' Optimization Toolbox. The main difference is that the linear constraints are specified with A, l, u instead of A, b, Aeq, beq. The functions for evaluating the objective function, constraints and Hessian are identical.
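For readers coming from FMINCON, the mapping can be sketched as follows: separate equality (Aeq*x == beq) and inequality (Ai*x <= bi) constraint data fold into the single l <= A*x <= u form. The matrices below are illustrative values, not from this document:

```python
import numpy as np
from scipy.sparse import csr_matrix

# Hypothetical FMINCON-style constraint data: Aeq*x == beq, Ai*x <= bi.
Aeq = np.array([[1.0, 1.0, 0.0]])
beq = np.array([2.0])
Ai = np.array([[0.0, 1.0, 1.0]])
bi = np.array([3.0])

# Stack into the single l <= A*x <= u form: rows for equalities get
# l = u = beq; rows for inequalities get l = -Inf and u = bi.
A = csr_matrix(np.vstack([Aeq, Ai]))
l = np.r_[beq, -np.inf * np.ones(len(bi))]
u = np.r_[beq, bi]
```

Equality rows are simply the degenerate case where the lower and upper limits coincide.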
Example from http://en.wikipedia.org/wiki/Nonlinear_programming:
>>> from numpy import array, r_, float64, dot
>>> from scipy.sparse import csr_matrix
>>> def f2(x):
...     f = -x[0] * x[1] - x[1] * x[2]
...     df = -r_[x[1], x[0] + x[2], x[1]]
...     d2f = -array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], float64)
...     return f, df, d2f
>>> def gh2(x):
...     h = dot(array([[1, -1, 1],
...                    [1, 1, 1]]), x**2) + array([-2.0, -10.0])
...     dh = 2 * csr_matrix(array([[ x[0], x[0]],
...                                [-x[1], x[1]],
...                                [ x[2], x[2]]]))
...     g = array([])
...     dg = None
...     return h, g, dh, dg
>>> def hess2(x, lam, cost_mult=1):
...     mu = lam["ineqnonlin"]
...     a = r_[dot(2 * array([1, 1]), mu), -1, 0]
...     b = r_[-1, dot(2 * array([-1, 1]), mu), -1]
...     c = r_[0, -1, dot(2 * array([1, 1]), mu)]
...     Lxx = csr_matrix(array([a, b, c]))
...     return Lxx
>>> x0 = array([1, 1, 0], float64)
>>> solution = pips(f2, x0, gh_fcn=gh2, hess_fcn=hess2)
>>> round(solution["f"], 11) == -7.07106725919
True
>>> solution["output"]["iterations"]
8
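The Hessian returned by hess2 in the example is the Hessian of the objective plus the mu-weighted Hessian of the inequality constraints h. A quick numpy check of this (the multiplier values below are chosen arbitrarily for the test, not taken from the solver):

```python
import numpy as np
from numpy import r_, dot, array
from scipy.sparse import csr_matrix

def hess2(x, lam, cost_mult=1):
    # Same function as in the example above.
    mu = lam["ineqnonlin"]
    a = r_[dot(2 * array([1, 1]), mu), -1, 0]
    b = r_[-1, dot(2 * array([-1, 1]), mu), -1]
    c = r_[0, -1, dot(2 * array([1, 1]), mu)]
    return csr_matrix(array([a, b, c]))

mu = array([0.3, 0.7])        # arbitrary multipliers for the check
x = array([1.0, 1.0, 0.0])    # hess2 ignores x here: h is quadratic in x
# Hessian of f(x) = -x0*x1 - x1*x2:
d2f = -array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], float)
# Hessian of mu'h, with h = [x0^2 - x1^2 + x2^2 - 2, x0^2 + x1^2 + x2^2 - 10]:
d2h = np.diag([2 * (mu[0] + mu[1]), 2 * (-mu[0] + mu[1]), 2 * (mu[0] + mu[1])])
Lxx = hess2(x, {"ineqnonlin": mu}).toarray()
```

Here Lxx equals d2f + d2h, i.e. the Hessian of the Lagrangian with cost_mult = 1.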
Ported by Richard Lincoln from the MATLAB Interior Point Solver (MIPS) (v1.9) by Ray Zimmerman. MIPS is distributed as part of the MATPOWER project, developed at the Power Systems Engineering Research Center (PSERC), Cornell University. See http://www.pserc.cornell.edu/matpower/ for more info.
MIPS was ported by Ray Zimmerman from C code written by H. Wang for his PhD dissertation:
- "On the Computation and Application of Multi-period Security-Constrained Optimal Power Flow for Real-time Electricity Market Operations", Cornell University, May 2007.
See also:
- H. Wang, C. E. Murillo-Sanchez, R. D. Zimmerman, R. J. Thomas, "On Computational Issues of Market-Based Optimal Power Flow", IEEE Transactions on Power Systems, Vol. 22, No. 3, Aug. 2007, pp. 1185-1193.
All parameters are optional except f_fcn and x0.
- Parameters:
f_fcn (callable) - Function that evaluates the objective function, its gradients and
Hessian for a given value of x. If there are
nonlinear constraints, the Hessian information is provided by the
'hess_fcn' argument and is not required here.
x0 (array) - Starting value of optimization vector x.
A (csr_matrix) - Optional linear constraint matrix.
l (array) - Optional lower bounds on A*x. Default values are -Inf.
u (array) - Optional upper bounds on A*x. Default values are Inf.
xmin (array) - Optional lower bounds on the x variables,
defaults are -Inf.
xmax (array) - Optional upper bounds on the x variables,
defaults are Inf.
gh_fcn (callable) - Function that evaluates the optional nonlinear constraints and
their gradients for a given value of x.
hess_fcn (callable) - Handle to function that computes the Hessian of the Lagrangian
for given values of x, lambda and mu, where lambda and mu are the
multipliers on the equality and inequality constraints, g and h, respectively.
opt (dict) - optional options dictionary with the following keys, all of which are also optional (default values shown in parentheses):
- verbose (False) - controls level of progress output displayed
- feastol (1e-6) - termination tolerance for feasibility condition
- gradtol (1e-6) - termination tolerance for gradient condition
- comptol (1e-6) - termination tolerance for complementarity condition
- costtol (1e-6) - termination tolerance for cost condition
- max_it (150) - maximum number of iterations
- step_control (False) - set to True to enable step-size control
- max_red (20) - maximum number of step-size reductions if step-control is on
- cost_mult (1.0) - cost multiplier used to scale the objective function for improved conditioning. Note: this value is also passed as the 3rd argument to the Hessian evaluation function so that it can appropriately scale the objective function term in the Hessian of the Lagrangian.
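As a sketch of the f_fcn signature described above, the callable returns the objective value, its gradient, and its Hessian for a given x. The quadratic below is a made-up example, not from this document:

```python
import numpy as np

def f_quad(x):
    # Illustrative quadratic objective: f(x) = 0.5 * x'Hx + c'x.
    H = np.array([[2.0, 0.0], [0.0, 4.0]])
    c = np.array([-2.0, -4.0])
    f = 0.5 * x @ H @ x + c @ x
    df = H @ x + c   # gradient of f
    d2f = H          # Hessian; only used when no hess_fcn is supplied
    return f, df, d2f
```

For this choice of H and c the unconstrained minimizer is x = [1, 1], where the gradient vanishes.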
- Returns: dict
  The solution dictionary has the following keys:
  - x - solution vector
  - f - final objective function value
  - converged - exit status
    - True = first order optimality conditions satisfied
    - False = maximum number of iterations reached
    - None = numerically failed
  - output - output dictionary with keys:
    - iterations - number of iterations performed
    - hist - list of arrays with trajectories of the following: feascond, gradcond, compcond, costcond, gamma, stepsize, obj, alphap, alphad
    - message - exit message
  - lmbda - dictionary containing the Lagrange and Kuhn-Tucker multipliers on the constraints, with keys:
    - eqnonlin - nonlinear equality constraints
    - ineqnonlin - nonlinear inequality constraints
    - mu_l - lower (left-hand) limit on linear constraints
    - mu_u - upper (right-hand) limit on linear constraints
    - lower - lower bound on optimization variables
    - upper - upper bound on optimization variables
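Since the converged flag described above is tri-state (True, False, or None), callers should compare with "is" rather than rely on truthiness, as in this small sketch:

```python
def describe_exit(converged):
    # Tri-state exit status from the pips solution dict:
    # True / False / None -- None is falsy, so use "is" comparisons.
    if converged is True:
        return "first order optimality conditions satisfied"
    if converged is False:
        return "maximum number of iterations reached"
    return "numerically failed"
```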