
UC San Diego Electronic Theses and Dissertations

Iterative methods for large-scale unconstrained optimization

Abstract

An unconstrained minimizer of a general nonlinear function may be found by solving a sequence of constrained subproblems in which a quadratic model function is minimized subject to a "trust-region" constraint on the norm of the change in variables. For the large-scale case, Steihaug has proposed an iterative method for the constrained subproblem based on the preconditioned conjugate-gradient (PCG) method. This method is terminated inside the trust region at an approximate minimizer or at the point where the iterates cross the trust-region boundary. When the iterates are terminated at the trust-region boundary, the final iterate is generally an inaccurate solution of the constrained subproblem. This may have an adverse effect on the efficiency and robustness of the overall trust-region method. A PCG-based method is proposed that may be used to solve the trust-region subproblem to any prescribed accuracy. The method starts by using a modified Steihaug method. If the solution lies on the trust-region boundary, a PCG-based sequential subspace minimization (SSM) method is used to solve the constrained problem over a sequence of evolving low-dimensional subspaces. A new regularized sequential Newton method is used to define basis vectors for the subspace minimization. Several preconditioners are proposed for the PCG iterations. Numerical results suggest that, in general, a trust-region method based on the proposed solver is more robust and requires fewer function evaluations than Steihaug's method.
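For readers unfamiliar with the Steihaug iteration that the abstract builds on, the following is a minimal sketch in Python/NumPy of a truncated conjugate-gradient method for the trust-region subproblem: minimize q(p) = g'p + (1/2) p'Bp subject to ||p|| <= delta. It is an illustration under stated assumptions, not the dissertation's implementation; preconditioning is omitted, and the names steihaug_cg, B, g, delta, and tol are hypothetical.

    import numpy as np

    def _boundary_step(p, d, delta):
        """Return tau >= 0 such that ||p + tau*d|| = delta (positive root of a quadratic)."""
        dd, pd, pp = d @ d, p @ d, p @ p
        return (-pd + np.sqrt(pd**2 + dd * (delta**2 - pp))) / dd

    def steihaug_cg(B, g, delta, tol=1e-8, max_iter=None):
        """Sketch of Steihaug-style truncated CG for min g'p + 0.5*p'Bp, ||p|| <= delta."""
        n = g.size
        max_iter = max_iter or n
        p = np.zeros(n)
        r = g.copy()        # gradient of the model at p: r = B p + g
        d = -r              # first search direction is steepest descent
        if np.linalg.norm(r) < tol:
            return p
        for _ in range(max_iter):
            Bd = B @ d
            dBd = d @ Bd
            if dBd <= 0:
                # Negative curvature: follow d to the trust-region boundary.
                return p + _boundary_step(p, d, delta) * d
            alpha = (r @ r) / dBd
            p_next = p + alpha * d
            if np.linalg.norm(p_next) >= delta:
                # Iterates cross the boundary: terminate on the boundary.
                return p + _boundary_step(p, d, delta) * d
            r_next = r + alpha * Bd
            if np.linalg.norm(r_next) < tol:
                return p_next   # approximate minimizer inside the trust region
            beta = (r_next @ r_next) / (r @ r)
            d = -r_next + beta * d
            p, r = p_next, r_next
        return p

The two early returns on the boundary correspond to the case the abstract highlights: when the iterates are stopped at the trust-region boundary, the returned point is generally only a rough solution of the constrained subproblem, which motivates the more accurate SSM-based solver proposed in the dissertation.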
