eScholarship
Open Access Publications from the University of California

UC San Diego Electronic Theses and Dissertations

Projected-search methods for box-constrained optimization

Abstract

Many algorithms used in unconstrained minimization are line-search methods. Given an initial point x and a function f : ℝⁿ → ℝ to be minimized, a line-search method repeatedly solves two subproblems: the first computes a search direction p; the second performs a line search on the function φ(α) = f(x + αp). Then αp is added to x and the process is repeated until a solution is located. Quasi-Newton methods are often used to compute the search direction. A quasi-Newton method builds a quadratic model of f at x and defines the search direction p so that x + p is the minimizer of the model. After each iteration the model is updated to more closely resemble f near x. Line searches seek to satisfy conditions that ensure the convergence of the sequence of iterates. A step that decreases f "sufficiently" is called an Armijo step. A Wolfe step satisfies stronger conditions that also impose bounds on the slope φ′(α). Quasi-Newton methods perform significantly better when using Wolfe steps.

Recently, Gill and Leonard proposed the reduced-Hessian (RH) method, a new quasi-Newton method for unconstrained optimization. This method exploits key structure in the quadratic model so that the dimension of the search space is reduced. Placing box constraints on x leads to more complex problems. One method for solving such problems is the projected-search method, which performs an unconstrained minimization over a changing subset of the variables and, during the line search, projects points that violate the constraints back into the feasible region. To date, projected-search methods have been restricted to an Armijo-like line search.

By modifying the line-search conditions, we create a new projected line search that uses a Wolfe-like step. This line search retains many of the benefits of a Wolfe line search in the unconstrained case. Projected-search methods and RH methods share a similar structure in solving for the search direction. We exploit this similarity and merge the two ideas to create a class of RH methods for box-constrained optimization. When combined with the new line search, this new family of algorithms solves a collection of standard test problems in less than 74% of the time required by the leading comparable alternative.
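
The Armijo and Wolfe conditions mentioned above can be stated concretely. The following is a minimal Python sketch, assuming f and its gradient g are supplied as callables and using the conventional constants c1 and c2; it shows the standard textbook conditions, not the specific variants developed in the dissertation.

```python
import numpy as np

def armijo_condition(f, g, x, p, alpha, c1=1e-4):
    """Sufficient decrease: phi(alpha) <= phi(0) + c1 * alpha * phi'(0),
    where phi(alpha) = f(x + alpha * p)."""
    slope0 = np.dot(g(x), p)                     # phi'(0)
    return f(x + alpha * p) <= f(x) + c1 * alpha * slope0

def strong_wolfe_conditions(f, g, x, p, alpha, c1=1e-4, c2=0.9):
    """Armijo test plus the curvature bound |phi'(alpha)| <= c2 * |phi'(0)|."""
    slope0 = np.dot(g(x), p)                     # phi'(0)
    slope_alpha = np.dot(g(x + alpha * p), p)    # phi'(alpha)
    sufficient_decrease = f(x + alpha * p) <= f(x) + c1 * alpha * slope0
    curvature = abs(slope_alpha) <= c2 * abs(slope0)
    return sufficient_decrease and curvature

# Example on a simple quadratic, where stepping to the minimizer
# satisfies both conditions.
f = lambda x: float(np.dot(x, x))
g = lambda x: 2.0 * x
x = np.array([1.0, -2.0])
p = -g(x)                                        # steepest-descent direction
print(strong_wolfe_conditions(f, g, x, p, alpha=0.5))   # True
```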

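To make the projection step concrete, here is a minimal sketch of a projected backtracking search, assuming simple bounds lower ≤ x ≤ upper and an Armijo-like sufficient-decrease test along the projected path. The function names (project, projected_backtracking_step) and the acceptance test are illustrative stand-ins, not the Wolfe-like conditions proposed in the dissertation.

```python
import numpy as np

def project(x, lower, upper):
    """Project x componentwise onto the box [lower, upper]."""
    return np.minimum(np.maximum(x, lower), upper)

def projected_backtracking_step(f, g, x, p, lower, upper,
                                alpha=1.0, c1=1e-4, shrink=0.5, max_iter=30):
    """Backtrack along the projected path x(alpha) = P(x + alpha * p) until an
    Armijo-like sufficient-decrease test holds (a simple stand-in for the
    Wolfe-like conditions described in the abstract)."""
    fx, gx = f(x), g(x)
    for _ in range(max_iter):
        x_new = project(x + alpha * p, lower, upper)
        # Sufficient decrease is measured against the actual step taken, x_new - x.
        if f(x_new) <= fx + c1 * np.dot(gx, x_new - x):
            return x_new, alpha
        alpha *= shrink
    return x, 0.0   # no acceptable step found

# Example: minimize ||x - c||^2 over the box [0, 1]^2 starting inside the box.
c = np.array([2.0, -1.0])
f = lambda x: float(np.dot(x - c, x - c))
g = lambda x: 2.0 * (x - c)
lower, upper = np.zeros(2), np.ones(2)
x = np.array([0.5, 0.5])
x_new, alpha = projected_backtracking_step(f, g, x, -g(x), lower, upper)
print(x_new, alpha)   # accepts the full step at the box corner [1, 0]
```

Because every trial point is projected back onto the box, the iterates stay feasible by construction, which is the key property the projected-search methods in the abstract rely on.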