SF3840 Numerical nonlinear programming, 7.5 cr
Below is a brief summary of the contents of past lectures in the
course, with some pointers to relevant parts of the lecture notes. It
is not intended to be complete, but rather to assist students who may
have missed some lectures.
In addition, a brief summary of upcoming lectures is given.
Brief summary of past lectures
Lecture 1, May 10.
Introduction to the course.
Lecture 2, May 18.
Definition of local and global minimizer. Optimality conditions for
unconstrained minimization. Convergence, limit points, convergence
rate, open set, closed set, Taylor series expansions, vector norms,
matrix norms.
Proof of convergence rate for Newton's
method (Section 3.14). Homework assignment 1 was
made available.
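As a concrete illustration of the rate proved in Section 3.14, the
following small Python sketch (the test function and starting point
are chosen here for illustration, not taken from the lecture notes)
runs Newton's method on a smooth strictly convex function; the number
of correct digits in the printed error roughly doubles per iteration.

    import numpy as np

    # Minimize f(x) = sum_i (exp(x_i) - x_i); the unique minimizer is x* = 0.
    grad = lambda x: np.exp(x) - 1.0
    hess = lambda x: np.diag(np.exp(x))

    x = np.array([1.0, -0.5])
    for k in range(7):
        print(k, np.linalg.norm(x))                  # error ||x_k - x*||
        x = x - np.linalg.solve(hess(x), grad(x))    # Newton step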
Lecture 3, May 25.
Introduction to linesearch methods. Definition of sufficient decrease
in linesearch. Goldstein-Armijo and Wolfe linesearch
conditions. Convergence proof for a first-order linesearch method with
Goldstein-Armijo type linesearch. Definition of sufficient descent
direction (Section 3.10).
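A minimal backtracking sketch enforcing the Armijo (sufficient-decrease)
part of these conditions; the Goldstein lower bound and the Wolfe
curvature condition are omitted for brevity, and all names and
parameter values are choices made for this sketch.

    import numpy as np

    def armijo_backtracking(f, g, x, p, eta=1e-4, factor=0.5):
        # Require f(x + alpha p) <= f(x) + eta * alpha * g(x)'p; p must be
        # a descent direction (g(x)'p < 0) for the loop to terminate.
        fx, slope = f(x), g(x) @ p
        alpha = 1.0
        while f(x + alpha * p) > fx + eta * alpha * slope:
            alpha *= factor
        return alpha

    f = lambda x: x @ x
    g = lambda x: 2.0 * x
    x = np.array([1.0, 2.0])
    alpha = armijo_backtracking(f, g, x, -g(x))   # steepest-descent direction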
Lecture 4, June 1.
Trust-region methods. Second-order linesearch methods.
Homework assignment 2 was made available (on June 2).
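For the trust-region part, a sketch of the Cauchy point, the minimizer
of the quadratic model along the steepest-descent direction within the
trust region, on which the basic convergence theory rests (a standard
construction, written in notation chosen for this sketch):

    import numpy as np

    def cauchy_point(g, H, delta):
        # Minimize m(p) = g'p + 0.5 p'Hp along -g subject to ||p|| <= delta.
        # Assumes g != 0.
        gnorm = np.linalg.norm(g)
        gHg = g @ H @ g
        if gHg <= 0.0:
            tau = 1.0                    # model decreases all the way to the boundary
        else:
            tau = min(gnorm**3 / (delta * gHg), 1.0)
        return -tau * (delta / gnorm) * g

A complete method compares actual and predicted reduction at the trial
point and expands or shrinks delta accordingly.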
Lecture 5, June 8.
Conjugate-gradient methods. Behavior on quadratics.
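The behavior on quadratics can be observed directly: the linear
conjugate-gradient sketch below minimizes 0.5 x'Ax - b'x (equivalently,
solves Ax = b for symmetric positive definite A) and terminates in at
most n steps in exact arithmetic. This is a textbook formulation, not
code from the course.

    import numpy as np

    def cg(A, b, tol=1e-10):
        x = np.zeros_like(b)
        r = b - A @ x                    # residual = negative gradient of the quadratic
        p = r.copy()
        rr = r @ r
        for _ in range(len(b)):          # at most n iterations in exact arithmetic
            Ap = A @ p
            alpha = rr / (p @ Ap)        # exact linesearch along p
            x += alpha * p
            r -= alpha * Ap
            rr, rr_old = r @ r, rr
            if np.sqrt(rr) <= tol:
                break
            p = r + (rr / rr_old) * p    # next direction, A-conjugate to the previous
        return x

    M = np.random.rand(5, 5)
    A = M @ M.T + 5.0 * np.eye(5)        # symmetric positive definite test matrix
    b = np.random.rand(5)
    print(np.linalg.norm(A @ cg(A, b) - b))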
Lecture 6, June 15.
Quasi-Newton methods. Behavior on quadratics. Symmetric rank-1
update. BFGS update. Cholesky factorization. BFGS update as a product
of Cholesky factors. Convergence properties of quasi-Newton methods.
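A sketch of the BFGS update in its inverse-Hessian form (the
product-of-Cholesky-factors form from the lecture is not reproduced
here). The update satisfies the secant condition and preserves positive
definiteness whenever s'y > 0, which a Wolfe linesearch guarantees.

    import numpy as np

    def bfgs_update_inverse(Hinv, s, y):
        # s = x_{k+1} - x_k, y = grad f(x_{k+1}) - grad f(x_k); needs s'y > 0.
        rho = 1.0 / (y @ s)
        I = np.eye(len(s))
        V = I - rho * np.outer(s, y)
        return V @ Hinv @ V.T + rho * np.outer(s, s)

On a strictly convex quadratic with exact linesearch, the iterates
coincide with those of the conjugate-gradient method and the updated
matrix equals the true inverse Hessian after n steps.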
Lecture 7, June 22.
Fundamental theory for nonlinear equality-constrained
problems. Feasible arcs. First- and second-order optimality
conditions for equality-constrained optimization. Optimality
conditions viewed as nonlinear equations.
Homework assignment 3,
homework assignment 4 and
homework assignment 5 were made available.
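The nonlinear-equations viewpoint can be made concrete on a small
example: for minimize f(x) subject to c(x) = 0, Newton's method is
applied to the system grad f(x) - A(x)' lam = 0, c(x) = 0, with A the
constraint Jacobian. The example problem and starting point below are
chosen here for illustration.

    import numpy as np

    # minimize f(x) = x1 + x2  subject to  c(x) = x1^2 + x2^2 - 1 = 0;
    # solution x* = (-1/sqrt(2), -1/sqrt(2)), lam* = -1/sqrt(2).
    x, lam = np.array([-1.0, -0.5]), -1.0
    for _ in range(10):
        g = np.array([1.0, 1.0]) - 2.0 * lam * x   # gradient of the Lagrangian
        c = x @ x - 1.0
        H = -2.0 * lam * np.eye(2)                 # Hessian of the Lagrangian
        A = 2.0 * x.reshape(1, 2)                  # constraint Jacobian
        K = np.block([[H, -A.T], [A, np.zeros((1, 1))]])
        d = np.linalg.solve(K, -np.concatenate([g, [c]]))
        x, lam = x + d[:2], lam + d[2]
    print(x, lam)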
Brief summary of the planned lecture
Lecture 8
The KKT matrix and its inertial properties. Penalty function methods,
two-norm and one-norm penalty. Augmented Lagrangian function
methods. Sequential quadratic programming methods. Properties of the
augmented Lagrangian function. Equivalence of augmented Lagrangian
functions and shifted penalty functions.
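The shifted-penalty equivalence can be checked by hand on a
one-dimensional example (chosen here for illustration): completing the
square shows that the augmented Lagrangian is the two-norm penalty
applied to the shifted constraint c(x) - lam/rho, and with the exact
multiplier its minimizer is the solution at any finite rho.

    # minimize f(x) = x^2 subject to c(x) = x - 1 = 0;  x* = 1, lam* = 2.
    def penalty_minimizer(rho):
        # argmin of x^2 + (rho/2)(x - 1)^2, available in closed form here
        return rho / (rho + 2.0)

    def auglag_minimizer(lam, rho):
        # argmin of x^2 - lam (x - 1) + (rho/2)(x - 1)^2
        return (lam + rho) / (rho + 2.0)

    for rho in (1.0, 10.0, 100.0):
        print(rho, penalty_minimizer(rho), auglag_minimizer(2.0, rho))
    # The penalty minimizer reaches x* = 1 only as rho -> infinity, while
    # the augmented Lagrangian with lam = lam* gives x* for every rho > 0.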
Brief summary of planned self-study lectures
Lecture 9
First-order optimality conditions for inequality-constrained problems. The
separation theorem and Farkas' lemma.
Second-order optimality conditions for inequality-constrained
problems. Illustration on a nonconvex quadratic program.
Lecture 10
Active-set methods and interior methods for quadratic
programming. Specialization to linear programming. The simplex method.
Lecture 11
Methods for inequality-constrained nonlinear
problems. Sequential quadratic programming methods, sequential
linearly constrained programming methods, augmented Lagrangian
methods.
Lecture 12
Interior methods, primal and primal-dual. Merit functions for interior
methods, augmented barrier merit function. Inertial properties and the
relationship to the primal-dual search direction for such a merit
function. Augmented Lagrangian merit function for inequality-constrained
optimization.
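As a sketch of the primal-dual viewpoint, one Newton step on the
perturbed optimality conditions for the simplified problem
minimize f(x) subject to x >= 0, namely grad f(x) - z = 0 and
x_i z_i = mu with (x, z) > 0. The fraction-to-the-boundary safeguard
and all names are choices made for this sketch.

    import numpy as np

    def primal_dual_step(g, H, x, z, mu):
        # g = grad f(x), H = Hessian of f at x; z holds the multipliers
        # for x >= 0.  Newton step on: g(x) - z = 0,  X Z e = mu e.
        n = len(x)
        K = np.block([[H, -np.eye(n)], [np.diag(z), np.diag(x)]])
        rhs = -np.concatenate([g - z, x * z - mu])
        d = np.linalg.solve(K, rhs)
        dx, dz = d[:n], d[n:]
        alpha = 1.0                      # fraction-to-the-boundary steplength
        for v, dv in ((x, dx), (z, dz)):
            neg = dv < 0.0
            if neg.any():
                alpha = min(alpha, 0.995 * np.min(-v[neg] / dv[neg]))
        return x + alpha * dx, z + alpha * dz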