
Assistant Professor
Contact
Office: Rhodes Hall 218
I'm broadly interested in the mathematics of data science, particularly the interplay of optimization, signal processing, statistics, and machine learning.
Stochastic subgradient method converges on tame functions
Updated (8/2019) [abstract]
Nonsmooth and nonconvex optimization under statistical assumptions
Updated (4/2019) [abstract]
Active strict saddles in nonsmooth optimization
Damek Davis, Dmitriy Drusvyatskiy
Manuscript (2019)
Stochastic algorithms with geometric step decay converge linearly on sharp functions
Damek Davis, Dmitriy Drusvyatskiy, Vasileios Charisopoulos
Manuscript (2019) [code]
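As a rough illustration of the idea in this paper's title (not the authors' implementation; see the linked [code] for that), here is a minimal NumPy sketch of a stochastic subgradient method with geometric step decay: the step size is held fixed within each epoch and shrunk by a factor q between epochs. The function name `sgd_geometric_decay` and all parameters are illustrative choices, not names from the paper.

```python
import numpy as np

def sgd_geometric_decay(subgrad, x0, step0=1.0, q=0.5,
                        epochs=10, iters_per_epoch=100, rng=None):
    # Stochastic subgradient method with geometric step decay:
    # run fixed-length epochs and shrink the step size by q after each one.
    rng = np.random.default_rng(0) if rng is None else rng
    x = np.asarray(x0, dtype=float)
    step = step0
    for _ in range(epochs):
        for _ in range(iters_per_epoch):
            g = subgrad(x, rng)      # noisy subgradient oracle
            x = x - step * g
        step *= q                    # geometric decay between epochs
    return x
```

For a sharp function such as f(x) = ||x - x*||_1, whose subgradients are bounded away from zero off the solution set, iterates with a fixed step oscillate in a band proportional to that step, so halving the step each epoch drives the error down geometrically.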
Low-rank matrix recovery with composite optimization: good conditioning and rapid convergence
Vasileios Charisopoulos, Yudong Chen, Damek Davis, Mateo Díaz, Lijun Ding, Dmitriy Drusvyatskiy
Manuscript (2019) [code]
Graphical Convergence of Subgradients in Nonconvex Optimization and Learning
Damek Davis, Dmitriy Drusvyatskiy
Mathematics of Operations Research (to appear)
Stochastic model-based minimization of weakly convex functions.
Damek Davis, Dmitriy Drusvyatskiy
SIAM Journal on Optimization (to appear) [blog]
INFORMS Optimization Society Young Researchers Prize (2019)
Stochastic subgradient method converges on tame functions.
Damek Davis, Dmitriy Drusvyatskiy, Sham Kakade, Jason D. Lee
Foundations of Computational Mathematics (to appear)
Finalist for the Best Paper Prize for Young Researchers in Continuous Optimization (2019)
The nonsmooth landscape of phase retrieval
Damek Davis, Dmitriy Drusvyatskiy, Courtney Paquette
IMA Journal on Numerical Analysis (to appear)
A Three-Operator Splitting Scheme and its Optimization Applications.
Damek Davis, Wotao Yin
Set-Valued and Variational Analysis (2017) [code] [slides]
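A minimal NumPy sketch of a three-operator splitting iteration of the kind this paper studies, for minimizing A(x) + B(x) + C(x) with C smooth, using proximal steps on A and B and gradient steps on C. This is an illustrative rendering, not the paper's reference implementation (see the linked [code]); the helper names `prox_l1`, `proj_nonneg`, and `davis_yin` are my own.

```python
import numpy as np

def prox_l1(v, t):
    # Soft-thresholding: proximal map of t * ||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proj_nonneg(v, t):
    # Projection onto the nonnegative orthant
    # (proximal map of its indicator; t is unused).
    return np.maximum(v, 0.0)

def davis_yin(gradC, proxA, proxB, z0, gamma, iters=1000):
    # Three-operator splitting for min_x A(x) + B(x) + C(x), C smooth:
    # alternate a prox step on B, a prox step on A evaluated at a
    # reflected point corrected by a gradient of C, then update z.
    z = z0.copy()
    for _ in range(iters):
        xB = proxB(z, gamma)
        xA = proxA(2 * xB - z - gamma * gradC(xB), gamma)
        z = z + (xA - xB)
    return xB
```

For example, taking A = lam * ||.||_1, B the indicator of the nonnegative orthant, and C(x) = 0.5 * ||Ax - b||^2 with step gamma below 2/L (L the Lipschitz constant of the gradient of C) yields a nonnegative lasso solver.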
Convergence rate analysis of several splitting schemes
Damek Davis, Wotao Yin
Splitting Methods in Communication and Imaging, Science and Engineering (2017) [video] [slides] [summary]
Winner of the 2014 INFORMS Optimization Society Best Student Paper Prize.
Sloan Research Fellowship in Mathematics (2020)
INFORMS Optimization Society Young Researchers Prize (2019)
Finalist for the Best Paper Prize for Young Researchers in Continuous Optimization (2019)
Finalist for the A. W. Tucker Prize for outstanding doctoral thesis (2018)
NSF Math Postdoctoral Fellowship (2015)
Pacific Journal of Mathematics Dissertation Prize (2015)
INFORMS Optimization Society Student Paper Prize (2014)
NSF Graduate Research Fellowship (2010)
Elected to Phi Beta Kappa (2009)
I have given or will give talks or poster presentations of my work at Cornell (2015–2017); University of California, Los Angeles (2010–2015, 2017); Stanford (2015); University of Washington, Seattle (2015); University of Waterloo (2015); University of Wisconsin, Madison (2015, 2016); CVPR (2014); International Symposium on Mathematical Programming (2015, 2018); Institute for Mathematics and its Applications (2016); INFORMS annual meeting (2014–2017, 2019); INFORMS international meeting (2016); SIAM Imaging conference (2016); SIAM annual meeting (2016); Google, NYC (2016, 2017); SIAM Optimization Conference (2017); EUROPT (2017); MIT ORC (2018); Princeton ORFE (2019); and ICCOPT (2019).
I was on the program committee for OPT2016 at NIPS. I regularly review articles for several optimization journals and conferences. I chaired sessions at the INFORMS 2016 international meeting and the INFORMS 2017 Annual Meeting. I have sat on panels that help students with fellowship applications, spoken at the Ithaca High School mathematics seminar, and helped students individually.
I am a field member of ORIE, Math, and CAM.
Current PhD Students
Vasilis Charisopoulos (ORIE)
Mateo Díaz (CAM)
Ben Grimmer (ORIE, Joint with J. Renegar)
Current Undergraduate Students
None
Past Undergraduate Students
Naijia (Anna) Dong (ORIE)
Next position: Columbia University, Master's Program in Operations Research
Optimization: Structure, Duality, Calculus, and Algorithms
Draft of F’19 notes for my course ORIE 6300
(Last Update: 1/2020)
Current:
(S’2020) ORIE 4740 Statistical Data Mining I
Past:
(F’18) ORIE 3300/5300: Optimization I
(S’18) Math 2940: Linear Algebra for Engineers
(S’17) ORIE 4350 Game Theory
(12/15/2019) Dmitriy Drusvyatskiy and I just uploaded a new paper entitled "Active strict saddles in nonsmooth optimization."
(10/28/2019) Dmitriy Drusvyatskiy and I updated our paper on "Robust stochastic optimization with the proximal point method." The updated paper, now called "From low probability to high confidence in stochastic convex optimization," also has two new coauthors: Lin Xiao and Junyu Zhang.