Recently, there has been increasing interest in first-order methods for conic linear programming, an extension of linear programming in which scalar inequalities are replaced by vector inequalities defined with respect to convex cones. Conic programming is an important class of optimization problems with a wide range of applications in operations research, finance, and engineering, and there is a rich history of research into methods for solving these problems efficiently and accurately.
First-order methods show potential to improve the efficiency of conic programming solvers because they circumvent the need to factor matrices or solve linear systems, operations that can be especially expensive for complex models or problems with large data sets. In this work, we examine two first-order methods for solving conic linear programs using a self-dual embedding whose solution yields either a primal-dual optimal solution or a certificate of primal or dual infeasibility. These methods require no feasible starting point, and their most expensive operations are matrix-vector multiplications and projections onto cones, which makes them easy to customize to exploit problem structure.
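To illustrate the kind of cone projection these methods rely on, the following sketch (an illustrative example of ours, not code from this work) computes the Euclidean projection onto the second-order cone, one of the standard cones appearing in conic linear programs:

```python
import numpy as np

def project_soc(z):
    """Euclidean projection onto the second-order cone
    K = {(x, t) : ||x||_2 <= t}, where z = (x, t) with t the last entry."""
    x, t = z[:-1], z[-1]
    nx = np.linalg.norm(x)
    if nx <= t:          # already inside the cone
        return z.copy()
    if nx <= -t:         # inside the polar cone: projects to the origin
        return np.zeros_like(z)
    # otherwise project onto the cone's boundary
    alpha = (nx + t) / 2.0
    return np.concatenate([alpha * x / nx, [alpha]])
```

Projections like this one have a closed form and cost only a few vector operations, which is why per-iteration work in these methods is dominated by matrix-vector products.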
We compare two methods, the primal-dual hybrid gradient (PDHG) method and the extragradient method, each augmented with a suite of practical improvements. We test these methods on linear programs as well as non-polyhedral conic linear programs. The numerical experiments indicate that these methods solve the self-dual embedding efficiently and accurately.
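As a concrete illustration of the first of these methods, the sketch below (a simplified example of ours, not the embedding-based algorithm studied in this work) applies the basic PDHG iteration to a standard-form LP, min c'x subject to Ax = b, x >= 0, via its primal-dual saddle-point formulation; the step sizes tau and sigma and the iteration count are illustrative choices satisfying tau * sigma * ||A||^2 < 1:

```python
import numpy as np

def pdhg_lp(A, b, c, iters=5000):
    """Primal-dual hybrid gradient for the LP
        min  c @ x   s.t.  A @ x = b,  x >= 0,
    written as the saddle point  min_{x >= 0} max_y  c @ x + y @ (b - A @ x)."""
    m, n = A.shape
    # step sizes chosen so that tau * sigma * ||A||_2^2 = 0.81 < 1
    tau = sigma = 0.9 / np.linalg.norm(A, 2)
    x, y = np.zeros(n), np.zeros(m)
    for _ in range(iters):
        # primal step: gradient step on the Lagrangian, then project onto x >= 0
        x_new = np.maximum(0.0, x - tau * (c - A.T @ y))
        # dual step: ascent on the extrapolated iterate 2*x_new - x
        y = y + sigma * (b - A @ (2 * x_new - x))
        x = x_new
    return x, y
```

For example, on the tiny LP min x1 + x2 subject to x1 + x2 = 1, x >= 0, this iteration drives the objective and the dual variable to the optimal value 1. Each iteration costs two matrix-vector products and one projection, the pattern emphasized above.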