I am a fifth-year PhD student in Operations Research at Cornell University, where I am working with Jim Renegar and Damek Davis. I spent Spring 2020 at Google Research working on adversarial optimization and Fall 2017 at UC Berkeley as part of a Simons Institute program bridging continuous and discrete optimization.
My current research focuses on algorithms for continuous optimization problems where very little structure is assumed. For example, the selected works below all address optimization problems outside the scope of traditional theory: nonconvex-nonconcave minimax optimization (which arises in adversarial ML training and reinforcement learning), nonconvex nonsmooth optimization, and non-Lipschitz optimization, respectively. I was awarded an NSF Fellowship in 2017 supporting this research.
Office: 295 Rhodes Hall
Email: bdg79 at cornell.edu
The Landscape of Nonconvex-Nonconcave Minimax Optimization
Benjamin Grimmer, Haihao Lu, Pratik Worah, Vahab Mirrokni
arXiv

Proximally Guided Stochastic Subgradient Method for Nonsmooth, Nonconvex Problems
Damek Davis, Benjamin Grimmer
SIAM Journal on Optimization, 2019
arXiv, Julia

Radial Subgradient Method
Benjamin Grimmer
SIAM Journal on Optimization, 2018
arXiv, Julia