We present an algorithm to compute the pseudospectral abscissa of a nonlinear eigenvalue problem. The algorithm relies on global under-estimator and over-estimator functions for the eigenvalue and singular value functions involved; these global models follow from eigenvalue perturbation theory. The algorithm has three distinctive features. First, it converges to the globally rightmost point of the pseudospectrum and is immune to nonsmoothness. The global convergence assertion holds under the assumption that a global lower bound is available for the second derivative of a singular value function depending on one parameter. Such a lower bound may be difficult to deduce analytically, but assigning a large negative value works robustly in practice. Second, the algorithm is applicable to large-scale problems, since the dominant cost per iteration stems from computing the smallest singular value and associated singular vectors, for which efficient iterative solvers can be used. Furthermore, a significant increase in computational efficiency can be obtained by subspace acceleration, that is, by restricting the domains of the linear maps associated with the matrices involved to small but suitable subspaces and solving the resulting reduced problems. Occasional restarts of these subspaces further enhance the efficiency for large-scale problems. Finally, in contrast to existing iterative approaches based on constructing low-rank perturbations and computing rightmost eigenvalues, the algorithm relies only on computing singular values of complex matrices. Hence, the algorithm does not require the solution of nonlinear eigenvalue problems, which further increases efficiency and reliability. This work is accompanied by a robust, publicly available implementation of the algorithm.
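To illustrate the quantity that dominates the per-iteration cost, the following minimal sketch (not the accompanying implementation) evaluates the smallest singular value of T(z) and its associated singular vectors at a point z. The delay-type form T(z) = A0 + z A1 + exp(-z) A2 and the unweighted pseudospectrum criterion sigma_min(T(z)) <= epsilon are assumptions chosen for illustration only; the dense SVD stands in for the iterative solvers mentioned above.

# Hypothetical example, not the authors' method: evaluate sigma_min(T(z))
# and its singular vectors for T(z) = A0 + z*A1 + exp(-z)*A2.
import numpy as np

def smallest_singular_triplet(z, A0, A1, A2):
    """Return sigma_min(T(z)) with its left/right singular vectors."""
    T = A0 + z * A1 + np.exp(-z) * A2      # assemble T(z) at a complex point z
    U, s, Vh = np.linalg.svd(T)            # dense SVD; singular values descend
    return s[-1], U[:, -1], Vh[-1].conj()  # smallest singular triplet

rng = np.random.default_rng(0)
n = 50
A0, A1, A2 = (rng.standard_normal((n, n)) for _ in range(3))
# Under the unweighted definition assumed here, z belongs to the
# epsilon-pseudospectrum whenever this value is at most epsilon.
sigma, u, v = smallest_singular_triplet(0.5 + 1.0j, A0, A1, A2)
print(sigma)

Under subspace acceleration as described above, such evaluations would plausibly act on reduced matrices of the form T(z)V, with V a tall matrix whose orthonormal columns span the accelerating subspace, so that only small rectangular problems need to be solved per iteration.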