Using Similarity to Accelerate the Primal-Dual Hybrid Gradient Algorithm for Linear Programming
This thesis investigates the use of several acceleration schemes to improve the performance of the primal-dual hybrid gradient algorithm for linear programming (PDLP), a state-of-the-art first-order method for linear programming (LP). The three main acceleration schemes investigated are Polyak heavy-ball momentum, Nesterov acceleration, and steering vectors selected in a residual momentum direction.
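To fix ideas, the base iteration that these schemes modify can be sketched as follows. This is a minimal illustrative sketch of PDHG on a standard-form LP with a Polyak heavy-ball momentum term added to both updates; it is not the PDLP implementation itself, which additionally uses restarts, adaptive step sizes, presolve, and other enhancements. The function name, the tiny example LP, and the choice of momentum coefficient are all assumptions made for illustration.

```python
import numpy as np

def pdhg_heavy_ball(c, A, b, tau, sigma, beta=0.1, iters=10000):
    """Sketch of PDHG for the LP  min c'x  s.t.  Ax = b, x >= 0,
    with a Polyak heavy-ball momentum term beta*(z - z_prev)
    added to both the primal and dual updates (illustrative only)."""
    m, n = A.shape
    x = x_prev = np.zeros(n)
    y = y_prev = np.zeros(m)
    for _ in range(iters):
        # Primal step: gradient step on the Lagrangian, projected onto
        # the nonnegative orthant, plus the momentum term.
        x_new = np.maximum(0.0, x - tau * (c - A.T @ y) + beta * (x - x_prev))
        # Dual step: uses the extrapolated iterate 2*x_new - x, as in
        # plain PDHG, plus an analogous momentum term.
        y_new = y + sigma * (b - A @ (2.0 * x_new - x)) + beta * (y - y_prev)
        x_prev, x = x, x_new
        y_prev, y = y, y_new
    return x, y

# Tiny example: min x1 + 2*x2  s.t.  x1 + x2 = 1, x >= 0 (optimum x = (1, 0)).
c = np.array([1.0, 2.0])
A = np.array([[1.0, 1.0]])
b = np.array([1.0])
step = 0.7 / np.linalg.norm(A, 2)  # ensures tau * sigma * ||A||^2 < 1
x, y = pdhg_heavy_ball(c, A, b, step, step)
```

The step-size choice follows the standard PDHG stability condition tau * sigma * ||A||^2 < 1; with momentum added, a slightly more conservative step is used here as a safety margin.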