Sequential quadratic programming (SQP) methods form a popular class of algorithms for nonlinearly constrained optimization. They are particularly effective for solving a sequence of related problems, such as those arising in mixed-integer nonlinear programming and the optimization of functions subject to differential equation constraints. Recently, there has been considerable interest in the formulation of stabilized SQP methods, which are specifically designed to handle degenerate optimization problems. Existing stabilized SQP methods are essentially local in the sense that both the formulation and the analysis focus on the properties of the methods in a neighborhood of a solution. A new SQP method is proposed that has favorable global convergence properties yet, under suitable assumptions, is equivalent to a variant of the conventional stabilized SQP method in the neighborhood of a solution. The method combines a primal-dual generalized augmented Lagrangian function with a flexible line search to obtain a sequence of improving estimates of the solution. The method incorporates a convexification algorithm that allows the use of exact second derivatives to define a convex quadratic programming (QP) subproblem without requiring that the Hessian of the Lagrangian be positive definite in the neighborhood of a solution. This gives the potential for fast local convergence. Additional benefits of the method are that each QP subproblem is regularized and always has a known feasible point. Numerical experiments are presented for a subset of the problems from the CUTEr test collection.
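
As a rough illustration of the kind of merit function referred to above (a sketch with assumed notation, not a formula quoted from the abstract), a primal-dual generalized augmented Lagrangian for the equality-constrained problem of minimizing $f(x)$ subject to $c(x)=0$ may be written as
\[
  M_\nu(x,\,y \,;\, y^E,\,\mu) \;=\; f(x) \;-\; c(x)^T y^E
  \;+\; \frac{1}{2\mu}\,\|c(x)\|_2^2
  \;+\; \frac{\nu}{2\mu}\,\bigl\|c(x) + \mu\,(y - y^E)\bigr\|_2^2 ,
\]
where $x$ denotes the primal variables, $y$ the dual variables, $y^E$ a fixed multiplier estimate, $\mu > 0$ a penalty parameter, and $\nu \ge 0$ a scalar weight ($\nu = 0$ recovers a conventional augmented Lagrangian in $x$ alone). Under this assumed notation, the line-search steps described in the abstract can be interpreted as jointly decreasing such a function in the primal and dual variables.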