Researchers: Anders Forsgren, in cooperation with Philip E. Gill (UCSD).
Sponsor: The Swedish Research Council (VR).
The goal of this project is the development of computationally efficient methods for solving large sparse nonlinear optimization problems. We focus on methods that utilize second derivatives, since we expect such methods to prove more robust and efficient than methods that use only first-derivative information. Recent research has been directed towards interior methods for nonlinear optimization, in particular the linear algebra issues arising in such methods, concerning both direct factorization methods and iterative methods.
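To give a flavor of the kind of computation involved, the sketch below shows a minimal primal log-barrier interior method for a toy bound-constrained quadratic program, using exact second derivatives; the dense `numpy.linalg.solve` call stands in for the sparse factorization or iterative solves studied in the project. The function name, the toy problem, and all parameter choices are illustrative assumptions, not the project's actual algorithms.

```python
import numpy as np

def barrier_newton(Q, c, mu0=1.0, tol=1e-8, max_iter=200):
    """Minimize 0.5*x'Qx + c'x subject to x > 0 by a log-barrier
    interior method (illustrative sketch only, not the project's method).
    Each step solves a Newton system built from the exact Hessian."""
    n = len(c)
    x = np.ones(n)          # strictly feasible starting point
    mu = mu0                # barrier parameter, driven to zero
    for _ in range(max_iter):
        grad = Q @ x + c - mu / x             # gradient of the barrier function
        H = Q + mu * np.diag(1.0 / x**2)      # exact second derivatives
        dx = np.linalg.solve(H, -grad)        # direct (factorization-based) solve
        # fraction-to-boundary rule keeps the iterate strictly positive
        neg = dx < 0
        alpha = min(1.0, 0.995 * np.min(-x[neg] / dx[neg])) if neg.any() else 1.0
        x = x + alpha * dx
        if np.linalg.norm(grad) < 10 * mu:    # inner problem solved accurately enough
            mu *= 0.1                         # shrink the barrier parameter
        if mu < tol and np.linalg.norm(grad) < tol:
            break
    return x

# Toy example: the unconstrained minimizer of 0.5*x'Qx + c'x is (1, -0.5),
# so the bound x > 0 is active in the second component at the solution.
Q = np.array([[2.0, 0.0], [0.0, 2.0]])
c = np.array([-2.0, 1.0])
x = barrier_newton(Q, c)
```

At realistic scale the Newton system is large and sparse, which is exactly where the choice between sparse factorization and iterative linear solvers becomes the central issue.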