FE501 – NLP Summary

Convex and Concave Functions

Single variable case

$$\forall x \in S,\ f''(x) \ge 0 \;\Rightarrow\; f(x) \text{ is convex}; \qquad \forall x \in S,\ f''(x) \le 0 \;\Rightarrow\; f(x) \text{ is concave} \tag{1}$$
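Criterion (1) can be checked symbolically. Below is a minimal sketch in Python with SymPy, assuming a hypothetical objective f(x) = x⁴ − 3x² and a hypothetical set S = [1, 5] (neither is from the course notes):

```python
import sympy as sp

x = sp.Symbol('x', real=True)
f = x**4 - 3*x**2                    # hypothetical objective, not from the notes
f2 = sp.diff(f, x, 2)                # second derivative f''(x) = 12*x**2 - 6

S = sp.Interval(1, 5)                # hypothetical set S
region = sp.solveset(f2 >= 0, x, sp.S.Reals)   # region where f''(x) >= 0
print(S.is_subset(region))           # True -> f is convex on S by criterion (1)
```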

 

Multiple variable case

$$H = \begin{bmatrix} a & c \\ d & b \end{bmatrix}, \qquad PM_1^{(1)} = a,\quad PM_1^{(2)} = b,\quad PM_2 = a \times b - c \times d \tag{2}$$
$$\forall (x_1,\dots,x_n) \in S,\ PM_k \ge 0\ \forall k \;\Rightarrow\; f(x_1,\dots,x_n) \text{ is convex}; \qquad \forall (x_1,\dots,x_n) \in S,\ (-1)^k PM_k \ge 0\ \forall k \;\Rightarrow\; f(x_1,\dots,x_n) \text{ is concave} \tag{3}$$
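As an illustration of (2)–(3), the sketch below (Python/SymPy) builds the Hessian of a hypothetical f(x₁, x₂) = x₁² + x₁x₂ + 2x₂² (not from the notes) and prints every k-th order principal minor; all of them are nonnegative, so this f is convex:

```python
from itertools import combinations
import sympy as sp

x1, x2 = sp.symbols('x1 x2', real=True)
f = x1**2 + x1*x2 + 2*x2**2          # hypothetical objective, not from the notes
X = [x1, x2]
H = sp.hessian(f, X)                 # [[2, 1], [1, 4]]

n = len(X)
for k in range(1, n + 1):
    for rows in combinations(range(n), k):       # every k-th order principal minor
        PMk = H[list(rows), list(rows)].det()
        print(k, rows, PMk)          # 2, 4, 7 -> all >= 0, so f is convex by (3)
```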

 

Solving NLPs with one variable

$$\max\ (\min)\ f(x) \quad \text{s.t. } x \in [a,b] \tag{4}$$
  1. Find all stationary points in [a, b] (solve $f'(x) = 0$).
  2. Find all other points in [a, b] where $f'(x)$ does not exist.
  3. Check the values of f at the endpoints a and b.
$$\text{if } f'(x_0) = 0,\ f''(x_0) < 0,\ a < x_0 < b \;\Rightarrow\; x_0 \text{ is a local max}; \qquad \text{if } f'(x_0) = 0,\ f''(x_0) > 0,\ a < x_0 < b \;\Rightarrow\; x_0 \text{ is a local min} \tag{5}$$
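A minimal sketch of this procedure in Python/SymPy, assuming a hypothetical differentiable f(x) = x³ − 6x² + 9x + 1 on [a, b] = [0, 3] (so step 2 contributes no extra candidates); the example is not from the course notes:

```python
import sympy as sp

x = sp.Symbol('x', real=True)
f = x**3 - 6*x**2 + 9*x + 1          # hypothetical objective, not from the notes
a, b = 0, 3

stationary = sp.solveset(sp.diff(f, x), x, sp.Interval(a, b))   # step 1
candidates = set(stationary) | {sp.Integer(a), sp.Integer(b)}   # step 3: endpoints

values = {c: f.subs(x, c) for c in candidates}
print(values)                        # f(0) = 1, f(1) = 5, f(3) = 1
print(max(values, key=values.get))   # x = 1 attains the max; f''(1) = -6 < 0, cf. (5)
```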

Unconstrained NLPs with multiple variables

$$\max\ (\text{or } \min)\ f(x_1,\dots,x_n) \quad \text{s.t. } (x_1,\dots,x_n) \in \mathbb{R}^n \tag{6}$$

First, find all candidate extremum points $\bar{x}$:

$$\text{let } \nabla f = 0 \text{ to get } \bar{x}; \qquad \text{if } H_k(\bar{x}) > 0\ \forall k \;\Rightarrow\; \bar{x} \text{ is a local min}; \qquad \text{if } (-1)^k H_k(\bar{x}) > 0\ \forall k \;\Rightarrow\; \bar{x} \text{ is a local max} \tag{7}$$
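For example, the sketch below (Python/SymPy, with a hypothetical quadratic objective that is not from the notes) solves $\nabla f = 0$ and evaluates the leading principal minors $H_k$ of the Hessian at each candidate, as in (7):

```python
import sympy as sp

x1, x2 = sp.symbols('x1 x2', real=True)
f = x1**2 + x2**2 - 2*x1 - 4*x2      # hypothetical objective, not from the notes
X = [x1, x2]

grad = [sp.diff(f, v) for v in X]
candidates = sp.solve(grad, X, dict=True)    # grad f = 0  ->  [{x1: 1, x2: 2}]
H = sp.hessian(f, X)

for pt in candidates:
    Hk = [H[:k, :k].det().subs(pt) for k in range(1, len(X) + 1)]
    print(pt, Hk)                    # [2, 4]: all H_k > 0 -> local min by (7)
```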

Constrained NLPs with multiple variables

Equality (Lagrangian)

$$\begin{aligned}
\max\ (\text{or } \min)\ & f(x_1,\dots,x_n) \\
\text{s.t. } & h_1(x_1,\dots,x_n) = b_1 \\
& h_2(x_1,\dots,x_n) = b_2 \\
& \qquad\vdots \\
& h_m(x_1,\dots,x_n) = b_m
\end{aligned} \tag{8}$$

Write the Lagrangian function:

$$\min:\ L = f + \sum_{i=1}^{m} \lambda_i (h_i - b_i); \qquad \max:\ L = f + \sum_{i=1}^{m} \lambda_i (b_i - h_i) \tag{9}$$

Solve

$$\frac{\partial L}{\partial x_i} = 0, \qquad \frac{\partial L}{\partial \lambda_i} = 0 \tag{10}$$
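As a worked sketch of (9)–(10) in Python/SymPy, take the hypothetical problem max f = x₁x₂ subject to x₁ + x₂ = 10 (so m = 1, b₁ = 10); the objective and constraint are illustrative only, not from the notes:

```python
import sympy as sp

x1, x2, lam1 = sp.symbols('x1 x2 lambda1', real=True)
f = x1 * x2                          # hypothetical objective, not from the notes
h, b = x1 + x2, 10                   # single equality constraint h = b

L = f + lam1 * (b - h)               # max form of (9)
eqs = [sp.diff(L, v) for v in (x1, x2, lam1)]    # conditions (10)
print(sp.solve(eqs, [x1, x2, lam1], dict=True))  # [{x1: 5, x2: 5, lambda1: 5}]
```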

Inequality (KKT)

$$\begin{aligned}
\max\ (\text{or } \min)\ & f(x_1,\dots,x_n) \\
\text{s.t. } & g_1(x_1,\dots,x_n) \le b_1 \\
& g_2(x_1,\dots,x_n) \le b_2 \\
& \qquad\vdots \\
& g_m(x_1,\dots,x_n) \le b_m
\end{aligned} \tag{11}$$

Then write the Lagrangian function:

$$\min:\ L = f + \sum_{i=1}^{m} \mu_i (g_i - b_i); \qquad \max:\ L = f + \sum_{i=1}^{m} \mu_i (b_i - g_i) \tag{12}$$

Then write the KKT conditions and solve:

  1. Stationarity: $\partial L / \partial x_i = 0$
  2. Primal feasibility: $g_i \le b_i$
  3. Complementary slackness: $\mu_i (g_i - b_i) = 0$
  4. Dual feasibility: $\mu_i \ge 0$
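The four conditions can be assembled and solved symbolically. Below is a minimal sketch in Python/SymPy, assuming the hypothetical problem max f = −(x₁ − 3)² − (x₂ − 2)² subject to g: x₁ + x₂ ≤ 4 (one constraint, so a single multiplier μ₁); none of these names or numbers come from the course notes:

```python
import sympy as sp

x1, x2, mu1 = sp.symbols('x1 x2 mu1', real=True)
f = -(x1 - 3)**2 - (x2 - 2)**2       # hypothetical objective, not from the notes
g, b = x1 + x2, 4                    # single inequality constraint g <= b

L = f + mu1 * (b - g)                # max form of (12)
system = [sp.diff(L, x1), sp.diff(L, x2),   # 1. stationarity
          mu1 * (g - b)]                    # 3. complementary slackness
sols = sp.solve(system, [x1, x2, mu1], dict=True)

# keep candidates that satisfy 2. primal (g <= b) and 4. dual (mu >= 0) feasibility
feasible = [s for s in sols if g.subs(s) <= b and s[mu1] >= 0]
print(feasible)                      # [{x1: 5/2, x2: 3/2, mu1: 1}]
```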

 

Important note from the instructor:

In constrained NLPs you cannot check the type of a candidate point by evaluating the Hessian matrix as we do in unconstrained optimization. Instead, we determine whether the point is a local maximum by evaluating some neighboring points. There is a method called the restricted Lagrangian, but it is beyond the material we cover in this class.