Explores the Method of Moving Asymptotes (MMA) by Svanberg, viewed as a sequential convex programming method, as a final project for Prof. Stephen Becker's Convex Optimization class (APPM 5630) at CU Boulder.
The notebook demo.ipynb accompanies the report and presentation. Throughout, external code by Arjen Deetman is used; it is based on Svanberg's own code. Comments mark where Arjen Deetman's code is reused. Any licensing issues should be directed to the original authors.
PyTorch, NumPy, and matplotlib are used extensively. Cooper (as presented here) is used only as a wrapper for the constrained minimization problem. This setup would then allow us to apply the primal-dual Adam / SGD-style optimization supported by Cooper with minimal changes (the box constraints would need to be rewritten as inequality constraints). We did not evaluate constrained training with any of the deep learning optimizers such as Adam, but if we did, we would likely use Cooper. A rough sketch of the underlying primal-dual update follows.
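To illustrate the idea, here is a minimal hand-rolled sketch of primal-dual Adam on a Lagrangian in plain PyTorch. This is not Cooper's API; the toy objective, constraint, learning rates, and iteration count are all made up for illustration. The box constraint 0 <= x <= 1 appears as the inequality constraints x - 1 <= 0 and -x <= 0:

```python
import torch

# Toy problem (hypothetical): minimize ||x - 2||^2 subject to sum(x) - 1 <= 0,
# with the box constraint 0 <= x <= 1 rewritten as x - 1 <= 0 and -x <= 0.
x = torch.full((2,), 0.5, requires_grad=True)   # primal variable
lam = torch.zeros(5, requires_grad=True)        # dual variables (multipliers)

primal_opt = torch.optim.Adam([x], lr=1e-2)
dual_opt = torch.optim.Adam([lam], lr=1e-2, maximize=True)  # gradient ascent

def constraints(x):
    g = torch.stack([x.sum() - 1.0])            # hypothetical problem constraint
    box = torch.cat([x - 1.0, -x])              # box constraints as g(x) <= 0
    return torch.cat([g, box])

for _ in range(2000):
    lagrangian = ((x - 2.0) ** 2).sum() + lam @ constraints(x)  # f(x) + lam^T g(x)
    primal_opt.zero_grad()
    dual_opt.zero_grad()
    lagrangian.backward()
    primal_opt.step()                           # descend in x
    dual_opt.step()                             # ascend in lam
    with torch.no_grad():
        lam.clamp_(min=0.0)                     # multipliers stay nonnegative
```

Cooper packages this same descent-in-primal / ascent-in-dual pattern behind its constrained minimization problem wrapper, which is why switching to it would require only minimal changes.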
The purpose of these gifs is to show how the convexity of the MMA approximation depends on the asymptotes.
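The effect can be reproduced with a minimal 1-D sketch of Svanberg's MMA approximation (the objective f(x) = x^2 and the asymptote distances below are hypothetical, not the exact settings used for the gifs): the closer the asymptotes L < x0 < U sit to the expansion point, the more curved (more conservative) the convex approximation becomes.

```python
import numpy as np
import matplotlib.pyplot as plt

def mma_approx(x, x0, f0, df0, L, U):
    # Svanberg's first-order MMA approximation around x0 with asymptotes
    # L < x0 < U: r + p/(U - x) + q/(x - L), matching f and f' at x0.
    p = max(df0, 0.0) * (U - x0) ** 2
    q = max(-df0, 0.0) * (x0 - L) ** 2
    r = f0 - p / (U - x0) - q / (x0 - L)
    return r + p / (U - x) + q / (x - L)

f = lambda t: t ** 2                  # hypothetical objective, for illustration
x0, df0 = 1.0, 2.0                    # expansion point and derivative f'(x0)
xs = np.linspace(0.2, 1.8, 400)
plt.plot(xs, f(xs), "k", label="f(x) = x^2")
for dist, style in [(2.0, "--"), (1.0, "-."), (0.4, ":")]:
    L, U = x0 - dist, x0 + dist       # closer asymptotes -> more convex approximation
    xa = np.linspace(max(L, 0.2) + 0.05, min(U, 1.8) - 0.05, 400)
    plt.plot(xa, mma_approx(xa, x0, f(x0), df0, L, U), style,
             label=f"asymptote distance {dist}")
plt.legend()
plt.show()
```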
Default initialization of moving asymptotes:
Twice as close moving asymptotes:
Five times as close moving asymptotes:
The purpose of this gif is to show the main ideas of the method of moving asymptotes: the expansion and contraction of the asymptotes, and sequential convex programming. A schematic version of this outer loop is sketched below.
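The sketch below is a rough, unconstrained simplification (hypothetical helper names and parameters; the real method, as in Arjen Deetman's code, handles constraints through a dual subproblem). Each iteration minimizes the convex MMA approximation at the current iterate within move limits, then expands or contracts the asymptotes depending on whether the iterates move monotonically or oscillate; the factors 1.2 and 0.7 are Svanberg's usual defaults.

```python
import numpy as np

def mma_step(df, x, L, U, lower, upper):
    # Minimize the separable convex MMA approximation
    #   sum_j p_j/(U_j - x_j) + q_j/(x_j - L_j)
    # coordinate-wise in closed form (no nonlinear constraints here).
    eps = 1e-6
    p = (U - x) ** 2 * (np.maximum(df, 0.0) + eps)
    q = (x - L) ** 2 * (np.maximum(-df, 0.0) + eps)
    x_new = (L * np.sqrt(p) + U * np.sqrt(q)) / (np.sqrt(p) + np.sqrt(q))
    alpha = np.maximum(lower, 0.9 * L + 0.1 * x)  # move limits inside asymptotes
    beta = np.minimum(upper, 0.9 * U + 0.1 * x)
    return np.clip(x_new, alpha, beta)

def mma(grad, x, lower, upper, iters=30):
    x_prev = x_prev2 = x.copy()
    L = x - 0.5 * (upper - lower)                 # typical default initialization
    U = x + 0.5 * (upper - lower)
    for k in range(iters):
        if k >= 2:
            # expand asymptotes where iterates move monotonically,
            # contract them where the iterates oscillate
            osc = (x - x_prev) * (x_prev - x_prev2)
            gamma = np.where(osc > 0, 1.2, np.where(osc < 0, 0.7, 1.0))
            L = x - gamma * (x_prev - L)
            U = x + gamma * (U - x_prev)
        x_prev2, x_prev = x_prev, x
        x = mma_step(grad(x), x, L, U, lower, upper)  # one convex subproblem
    return x

# Hypothetical smooth objective f(x) = ||x - 0.3||^2 over the box [0, 1]^3
x_star = mma(lambda x: 2 * (x - 0.3), np.full(3, 0.8), np.zeros(3), np.ones(3))
```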