Benjamin Grimmer




I have recently joined the faculty of the Johns Hopkins Department of Applied Mathematics and Statistics after finishing my PhD in Operations Research at Cornell University, advised by Jim Renegar and Damek Davis. I spent Spring 2020 at Google Research working on adversarial optimization and Fall 2017 at UC Berkeley as part of a Simons Institute program bridging continuous and discrete optimization.

My current research focuses on the design and analysis of algorithms for continuous optimization problems beyond the limited settings where classical theory applies. The selected works below all address fundamental issues in modern optimization, bridging the gap between classical approaches and the potentially stochastic, nonconvex, nonsmooth, or adversarial models employed in many modern data science and machine learning problems. I was awarded an NSF Fellowship in 2017 supporting this research.


Office: 100 Whitehead Hall
Email: grimmer at jhu.edu
CV: here


Selected Papers

Radial Duality, Part I: Foundations and Part II: Applications and Algorithms. arXiv: Part I, Part II
Benjamin Grimmer.

The Landscape of the Proximal Point Method for Nonconvex-Nonconcave Minimax Optimization. arXiv
Benjamin Grimmer, Haihao Lu, Pratik Worah, Vahab Mirrokni.

Proximally Guided Stochastic Subgradient Method for Nonsmooth, Nonconvex Problems. SIAM Journal on Optimization, 2019
Damek Davis, Benjamin Grimmer. arXiv, Julia