
SF3840 Numerical nonlinear programming, 7.5 cr

Below is a brief summary of the contents of past lectures in the course, with some pointers to relevant parts of the lecture notes. It is not intended to be complete, but rather to assist students who may have missed a lecture.

In addition, a brief summary of planned lectures is given.

Brief summary of past lectures

Lecture 1. January 23. Introduction to the course.

Lecture 2. January 30. Definition of local and global minimizers. Optimality conditions for unconstrained minimization. Convergence, limit points, convergence rate, open and closed sets, Taylor-series expansions, vector norms, matrix norms. Proof of the convergence rate of Newton's method (Section 2.3). Homework assignment 1 was made available.
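
For illustration (not taken from the lecture notes), below is a minimal Python sketch of the pure Newton iteration analyzed in the lecture; the test function and starting point are hypothetical, and no globalization safeguards are included.

    import numpy as np

    def newton(grad, hess, x, tol=1e-10, max_iter=50):
        # Pure Newton iteration: solve H(x) p = -g(x) and take the unit step.
        for _ in range(max_iter):
            g = grad(x)
            if np.linalg.norm(g) <= tol:
                break
            p = np.linalg.solve(hess(x), -g)  # Newton direction
            x = x + p                         # unit step, no linesearch
        return x

    # Hypothetical test problem: f(x) = x1^4 + x1*x2 + (1 + x2)^2.
    grad = lambda x: np.array([4*x[0]**3 + x[1], x[0] + 2*(1 + x[1])])
    hess = lambda x: np.array([[12*x[0]**2, 1.0], [1.0, 2.0]])
    print(newton(grad, hess, np.array([0.75, -1.25])))

Started sufficiently close to a minimizer with positive definite Hessian, the iterates display the quadratic convergence rate proved in the lecture.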

Lecture 3. February 6. Introduction to linesearch methods. Definition of sufficient decrease in linesearch. Goldstein-Armijo and Wolfe linesearch conditions. Convergence proof for a first-order linesearch method with Goldstein-Armijo type linesearch. Definition of sufficient descent direction (Sections 3.6 and 3.7).
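
As a complement (an illustrative sketch, not the formulation in the lecture notes), a backtracking linesearch enforcing the Armijo sufficient-decrease condition may be written as follows; the values of eta and factor are conventional illustrative choices. Note that plain backtracking enforces only the sufficient-decrease inequality; the Goldstein-Armijo and Wolfe conditions in addition rule out excessively short steps.

    import numpy as np

    def armijo_backtracking(f, grad, x, p, eta=1e-4, factor=0.5):
        # Reduce alpha until f(x + alpha*p) <= f(x) + eta*alpha*grad(x)'p.
        # Assumes p is a descent direction, i.e. grad(x)'p < 0.
        fx, gTp = f(x), grad(x) @ p
        alpha = 1.0
        while f(x + alpha * p) > fx + eta * alpha * gTp:
            alpha *= factor
            if alpha < 1e-16:
                raise ValueError("p does not appear to be a descent direction")
        return alpha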

Lecture 4. February 13. Trust-region methods. Second-order linesearch methods. Homework assignment 2 was made available (February 14).
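
As an illustration (with conventional constants that are not taken from the lecture notes), a basic trust-region iteration using the Cauchy point as an inexact solution of the trust-region subproblem might look as follows.

    import numpy as np

    def cauchy_point(g, B, delta):
        # Minimizer of the quadratic model along -g within the trust region.
        gn = np.linalg.norm(g)
        gBg = g @ B @ g
        tau = 1.0 if gBg <= 0 else min(gn**3 / (delta * gBg), 1.0)
        return -tau * (delta / gn) * g

    def trust_region(f, grad, hess, x, delta=1.0, tol=1e-8, max_iter=200):
        for _ in range(max_iter):
            g, B = grad(x), hess(x)
            if np.linalg.norm(g) <= tol:
                break
            p = cauchy_point(g, B, delta)
            pred = -(g @ p + 0.5 * p @ B @ p)   # predicted model reduction
            rho = (f(x) - f(x + p)) / pred      # actual / predicted reduction
            if rho > 0.75 and np.isclose(np.linalg.norm(p), delta):
                delta *= 2.0                    # good model: expand the region
            elif rho < 0.25:
                delta *= 0.25                   # poor model: shrink the region
            if rho > 0.1:
                x = x + p                       # accept the step
        return x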

Lecture 5. February 20. Conjugate-gradient methods. Behavior on quadratics. The presentation was based on my own setup. An alternative presentation can be found in a report by Tove Odland and me, On solving symmetric systems of linear equations in an unnormalized Krylov subspace framework, arXiv:1409.4937 [math.OC], 2014.
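
For concreteness, a minimal sketch of the conjugate-gradient method applied to the quadratic f(x) = (1/2) x'Hx - b'x, equivalently to solving Hx = b with H symmetric positive definite, is given below (illustrative code, not taken from the lecture notes or the report).

    import numpy as np

    def conjugate_gradient(H, b, x=None, tol=1e-10):
        n = len(b)
        x = np.zeros(n) if x is None else x
        r = b - H @ x                        # residual = minus the gradient
        p = r.copy()
        for _ in range(n):                   # at most n steps in exact arithmetic
            if np.linalg.norm(r) <= tol:
                break
            Hp = H @ p
            alpha = (r @ r) / (p @ Hp)       # exact linesearch along p
            x = x + alpha * p
            r_new = r - alpha * Hp
            beta = (r_new @ r_new) / (r @ r)
            p = r_new + beta * p             # new direction, H-conjugate to the old
            r = r_new
        return x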

Lecture 6. February 27. Quasi-Newton methods derived as methods equivalent to the method of conjugate gradients on quadratics. Symmetric rank-1 update. BFGS update. Convergence properties of quasi-Newton methods. Homework assignment 3 was made available.
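
The two updates mentioned above may be sketched as follows (illustrative code; the tolerance guarding the SR1 denominator is an arbitrary choice). Here s denotes the step x_{k+1} - x_k and y the corresponding change in gradient.

    import numpy as np

    def bfgs_update(B, s, y):
        # BFGS update of the approximate Hessian B; requires y's > 0.
        Bs = B @ s
        return B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)

    def sr1_update(B, s, y):
        # Symmetric rank-1 update; skipped when the denominator is too small.
        v = y - B @ s
        denom = v @ s
        if abs(denom) < 1e-8 * np.linalg.norm(v) * np.linalg.norm(s):
            return B
        return B + np.outer(v, v) / denom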

Lecture 7. March 20. Fundamental theory for nonlinear equality-constrained problems. Feasible arcs. First- and second-order optimality conditions for equality-constrained optimization. Optimality conditions viewed as nonlinear equations.
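
For reference, for the problem of minimizing f(x) subject to c(x) = 0, with A(x) the Jacobian of c and Lagrangian L(x,\lambda) = f(x) - \lambda^T c(x), the first-order conditions discussed in the lecture take the form

    \nabla f(x^*) = A(x^*)^T \lambda^*, \qquad c(x^*) = 0,

and the second-order necessary condition requires p^T \nabla_{xx}^2 L(x^*,\lambda^*) p \ge 0 for all p with A(x^*) p = 0.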

Lecture 8. March 27. The KKT matrix and its inertial properties. Penalty function methods, two-norm and one-norm penalty. Augmented Lagrangian function methods. Sequential quadratic programming methods. Properties of the augmented Lagrangian function. Equivalence of augmented Lagrangian functions and shifted penalty functions.
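
With the sign convention L(x,\lambda) = f(x) - \lambda^T c(x), the equivalence mentioned above is the identity

    L_A(x,\lambda;\rho) = f(x) - \lambda^T c(x) + \frac{\rho}{2} \|c(x)\|^2
                        = f(x) + \frac{\rho}{2} \Big\| c(x) - \frac{\lambda}{\rho} \Big\|^2 - \frac{\|\lambda\|^2}{2\rho},

so that, up to a constant independent of x, the augmented Lagrangian is a two-norm penalty function with the constraints shifted by \lambda/\rho.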

Lecture 9. April 10. First-order optimality conditions for inequality-constrained problems. The separation theorem and Farkas' lemma. Second-order optimality conditions for inequality-constrained problems. Homework assignment 4 and homework assignment 5 were made available (on April 13).
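
For reference, for the problem of minimizing f(x) subject to c(x) \ge 0, the first-order (KKT) conditions discussed in the lecture take the form

    \nabla f(x^*) = A(x^*)^T \lambda^*, \qquad \lambda^* \ge 0, \qquad c(x^*) \ge 0, \qquad {\lambda^*}^T c(x^*) = 0,

under a suitable constraint qualification; Farkas' lemma provides the existence of the nonnegative multiplier vector \lambda^*.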

Lecture 10. April 17. Active-set methods and interior methods for quadratic programming. Specialization to linear programming. The simplex method.
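
As an illustration of a building block common to these methods, the sketch below solves an equality-constrained QP, minimize (1/2) x'Hx + g'x subject to Ax = b, directly via its KKT system (the data H, A, g, b are hypothetical).

    import numpy as np

    def eq_qp(H, A, g, b):
        n, m = H.shape[0], A.shape[0]
        K = np.block([[H, A.T], [A, np.zeros((m, m))]])  # the KKT matrix
        rhs = np.concatenate([-g, b])
        sol = np.linalg.solve(K, rhs)
        return sol[:n], sol[n:]      # x and the Lagrange multipliers

    H = np.array([[2.0, 0.0], [0.0, 2.0]])
    A = np.array([[1.0, 1.0]])
    x, lam = eq_qp(H, A, np.array([-2.0, -4.0]), np.array([1.0]))
    print(x, lam)                    # x = [0, 1]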

Brief summary of planned lectures

Lecture 11. April 24. Methods for inequality-constrained nonlinear problems. Sequential quadratic programming methods, sequential linearly constrained programming methods, augmented Lagrangian methods.
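
In one common form (notation as for the equality-constrained case, with the inequality constraints linearized), the QP subproblem solved at an iterate (x_k, \lambda_k) of an SQP method is

    \min_p \; \nabla f(x_k)^T p + \frac{1}{2} p^T \nabla_{xx}^2 L(x_k,\lambda_k)\, p \quad \text{subject to} \quad c(x_k) + A(x_k) p \ge 0.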

Lecture 12. May 8. Interior methods, primal and primal-dual. Merit functions for interior methods, the augmented barrier merit function. Inertial properties and relationship to the primal-dual search direction for such a merit function. Augmented Lagrangian merit function for inequality-constrained optimization.
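
For reference, the primal barrier function for minimizing f(x) subject to c(x) \ge 0 is B_\mu(x) = f(x) - \mu \sum_i \ln c_i(x), and the primal-dual equations may be written as the perturbed optimality conditions

    \nabla f(x) - A(x)^T \lambda = 0, \qquad C(x) \lambda = \mu e,

where C(x) = \mathrm{diag}(c_1(x), \ldots, c_m(x)) and e is the vector of ones, solved approximately for a decreasing sequence of barrier parameters \mu > 0.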


Optimization and Systems Theory, KTH
Anders Forsgren, andersf@kth.se