I am a fifth-year PhD student in Operations Research at Cornell University, where I am working with Jim Renegar and Damek Davis. I spent Spring 2020 at Google Research working on adversarial optimization, and Fall 2017 at UC Berkeley as part of a Simons Institute program bridging continuous and discrete optimization.
My current research focuses on the design and analysis of algorithms for continuous optimization problems beyond the limited settings where classical theory applies. The selected works below all address fundamental methodological issues arising in modern optimization, bridging the gap between classical approaches and the potentially stochastic, nonconvex, nonsmooth, or adversarial models employed in many modern data science and machine learning problems. I was awarded an NSF Fellowship in 2017 supporting this research.
Office: 295 Rhodes Hall
Email: bdg79 at cornell.edu
The Landscape of the Proximal Point Method for Nonconvex-Nonconcave Minimax Optimization
Benjamin Grimmer, Haihao Lu, Pratik Worah, Vahab Mirrokni
arXiv

Proximally Guided Stochastic Subgradient Method for Nonsmooth, Nonconvex Problems
Damek Davis, Benjamin Grimmer
SIAM Journal on Optimization, 2019 (arXiv, Julia)

Radial Subgradient Method
Benjamin Grimmer
SIAM Journal on Optimization, 2018 (arXiv, Julia)