eScholarship
Open Access Publications from the University of California

UC San Diego Electronic Theses and Dissertations

Second-derivative sequential quadratic programming methods for nonlinear optimization

Abstract

Sequential Quadratic Programming (SQP) methods are a popular and successful class of methods for minimizing a general nonlinear function subject to nonlinear constraints. Under a standard set of assumptions, conventional SQP methods exhibit a fast rate of local convergence. However, in practice, a conventional SQP method involves solving an indefinite quadratic program (QP), which is NP-hard in general. As a result, approximations to the second derivatives are often used, which can slow the rate of local convergence and make it more likely that the algorithm converges to a saddle point rather than a local minimizer. In addition, the standard assumptions required for convergence often do not hold in practice. For such problems, regularized SQP methods, which also require second derivatives, have been shown to have good local convergence properties; however, few regularized SQP methods exhibit convergence to a minimizer from an arbitrary starting point. This thesis considers the formulation, analysis, and implementation of SQP methods with the following properties: (i) the solution of an indefinite QP is not required; (ii) regularization is performed in such a way that global convergence can be established under standard assumptions; and (iii) implementations of the method work well on degenerate problems.
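To make the structure of a conventional SQP iteration concrete, the following is a minimal sketch for the equality-constrained case, in which each QP subproblem is solved through its KKT system. The names (sqp_equality, f_grad, c_fun, c_jac, hess_lag) are illustrative assumptions, and the sketch omits the line search, inequality constraints, indefiniteness handling, and regularization that the thesis addresses; it is not the method developed there.

```python
import numpy as np

def sqp_equality(f_grad, c_fun, c_jac, hess_lag, x0, lam0, tol=1e-8, max_iter=50):
    """Basic SQP iteration for: minimize f(x) subject to c(x) = 0.

    Each step solves the equality-constrained QP
        minimize_p  g(x)^T p + 0.5 p^T H p   subject to  J(x) p + c(x) = 0
    via its KKT system (a Newton step on the first-order optimality conditions).
    """
    x, lam = np.asarray(x0, dtype=float), np.asarray(lam0, dtype=float)
    for _ in range(max_iter):
        g, c, J = f_grad(x), c_fun(x), c_jac(x)
        H = hess_lag(x, lam)                       # Hessian of the Lagrangian
        n, m = x.size, c.size
        K = np.block([[H, J.T], [J, np.zeros((m, m))]])
        rhs = -np.concatenate([g + J.T @ lam, c])
        step = np.linalg.solve(K, rhs)             # assumes the KKT matrix is nonsingular
        p, dlam = step[:n], step[n:]
        x, lam = x + p, lam + dlam
        if np.linalg.norm(p) < tol and np.linalg.norm(c) < tol:
            break
    return x, lam

# Example: minimize x0^2 + x1^2 subject to x0 + x1 - 1 = 0 (solution x = (0.5, 0.5)).
x_star, lam_star = sqp_equality(
    f_grad=lambda x: 2 * x,
    c_fun=lambda x: np.array([x[0] + x[1] - 1.0]),
    c_jac=lambda x: np.array([[1.0, 1.0]]),
    hess_lag=lambda x, lam: 2 * np.eye(2),
    x0=np.array([2.0, -3.0]),
    lam0=np.zeros(1),
)
```

When inequality constraints are present, the QP subproblem is no longer a single linear solve, and an indefinite Hessian of the Lagrangian makes that subproblem NP-hard in general, which is the difficulty the abstract refers to.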
