ORIE 6326: Convex Optimization

Overview

Convex optimization generalizes least-squares, linear and quadratic programming, and semidefinite programming, and forms the basis of many methods for nonconvex optimization. This course focuses on recognizing and solving convex optimization problems that arise in applications, and introduces a few algorithms for convex optimization.

Topics include:
- Convex sets, functions, and optimization problems.
- Basics of convex analysis.
- Least-squares, linear and quadratic programs, semidefinite programming, minimax, extremal volume, and other problems.
- Optimality conditions, duality theory, theorems of alternatives, and applications.
- Algorithms: interior-point, subgradient, proximal gradient, operator splitting methods such as ADMM, stochastic gradient, and branch and bound methods.
- Applications to statistics and machine learning, signal processing, control and mechanical engineering, and finance.

Prerequisites: a strong working knowledge of linear algebra and a modern scripting language (such as Python, Matlab, Julia, or R). Some analysis (open and closed sets, boundaries and interiors, limits) will also be extremely useful.

Topics

The treatment in this class draws strongly on other excellent courses in convex optimization. The lectures from the first part of the course follow Stephen Boyd's EE364a. For the second half, on algorithms, we have borrowed from Stephen Boyd's EE364b, Lieven Vandenberghe's EE236c, Ryu and Boyd's primer on monotone operator methods, Pontus Giselsson's course on large-scale convex optimization, Sébastien Bubeck's book Convex Optimization: Algorithms and Complexity, and Gabriel Goh's analysis of momentum.
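As a small taste of the algorithms covered, here is a minimal sketch of the proximal gradient method (one of the topics listed above) applied to the lasso problem, minimize (1/2)||Ax - b||^2 + lam*||x||_1. This is an illustrative NumPy implementation, not course-provided code; the function names and step-size choice (1/L with L the squared spectral norm of A) are our own assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (elementwise soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient_lasso(A, b, lam, iters=2000):
    # Minimize 0.5*||Ax - b||^2 + lam*||x||_1 by proximal gradient (ISTA).
    # Step size 1/L, where L = ||A||_2^2 is the Lipschitz constant of the
    # gradient of the smooth term.
    L = np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)          # gradient of the smooth part
        x = soft_threshold(x - grad / L, lam / L)  # prox step on the l1 part
    return x
```

Each iteration takes a gradient step on the smooth least-squares term, then applies the proximal operator of the nonsmooth l1 term; for small lam the iterates recover a sparse solution of an overdetermined system.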
