Assistant Professor
Contact
Office: Rhodes Hall 218
I'm broadly interested in the mathematics of data science, particularly the interplay of optimization, signal processing, statistics, and machine learning.
Damek Davis, Dmitriy Drusvyatskiy. Graphical Convergence of Subgradients in Nonconvex Optimization and Learning (2018)
Damek Davis, Dmitriy Drusvyatskiy. Stochastic model-based minimization of weakly convex functions. SIAM Journal on Optimization (to appear)
Damek Davis, Dmitriy Drusvyatskiy, Sham Kakade, Jason D. Lee. Stochastic subgradient method converges on tame functions. Foundations of Computational Mathematics (to appear)
Damek Davis, Dmitriy Drusvyatskiy, Courtney Paquette. The nonsmooth landscape of phase retrieval. IMA Journal of Numerical Analysis (to appear)
Damek Davis, Wotao Yin. A Three-Operator Splitting Scheme and its Optimization Applications. Set-Valued and Variational Analysis (2017)
Damek Davis, Wotao Yin. Convergence rate analysis of several splitting schemes. R. Glowinski, S. Osher, and W. Yin (Eds.), Splitting Methods in Communication and Imaging, Science and Engineering, New York, Springer (2016).
A. W. Tucker Prize finalist for outstanding doctoral thesis (2018)
NSF Math Postdoctoral Fellowship (2015)
Pacific Journal of Mathematics Dissertation Prize (2015)
INFORMS Optimization Society Student Paper Prize (2014)
NSF Graduate Research Fellowship (2010)
Elected to Phi Beta Kappa (2009)
I have given or will give talks or poster presentations of my work at Cornell (2015-2017); University of California, Los Angeles (2010-2015, 2017); Stanford (2015); University of Washington, Seattle (2015); University of Waterloo (2015); University of Wisconsin, Madison (2015, 2016); CVPR (2014); International Symposium on Mathematical Programming (2015, 2018); Institute for Mathematics and its Applications (2016); INFORMS annual meeting (2014-2017); INFORMS international meeting (2016); SIAM Imaging conference (2016); SIAM annual meeting (2016); Google, NYC (2016, 2017); SIAM Optimization Conference (2017); EUROPT (2017); Rensselaer Polytechnic Institute (2018); The Statistical and Applied Mathematical Sciences Institute (2018); MIT ORC (2018); DIMACS, Rutgers University (2018); and the New Computing-Driven Opportunities for Optimization Workshop in Fujian Province, China (2018).
I was on the program committee for OPT2016 at NIPS. I regularly review articles for several optimization journals and conferences. I chaired sessions at the INFORMS 2016 International Meeting and the INFORMS 2017 Annual Meeting. I have served on panels that help students with fellowship applications, spoken at the Ithaca High School mathematics seminar, and helped students individually as well.
I am a field member of ORIE, Math, and CAM.
Current PhD Students
Vasilis Charisopoulos (ORIE)
Mateo Díaz (CAM)
Ben Grimmer (ORIE, Joint with J. Renegar)
Current Undergraduate Students
None
Past Undergraduate Students
Naijia (Anna) Dong (ORIE)
Next position: Columbia University Master's Program in Operations Research
Current:
Past:
(F’18) ORIE 3300/5300: Optimization I
(S’18) Math 2940: Linear Algebra for Engineers
(S’17) ORIE 4350: Game Theory
(1-6-2019) Vasileios Charisopoulos, Mateo Diaz, Dmitriy Drusvyatskiy, and I just uploaded a paper on Composite optimization for robust blind deconvolution.
(12-23-2018) My paper with Sasha Aravkin on Trimmed Statistical Estimation via Variance Reduction has been accepted for publication in Mathematics of Operations Research.
(10-16-2018) Dmitriy Drusvyatskiy and I have just uploaded a new paper on Uniform Graphical Convergence of Subgradients in Nonconvex Optimization and Learning.
(10-1-2018) Ben Grimmer and I completed a major revision of our paper on the Proximally Guided Stochastic Subgradient Method for Nonsmooth, Nonconvex Problems, which includes a new technique for reducing the variance of solution estimates in nonsmooth, nonconvex stochastic approximation.
(8-26-2018) Dmitriy Drusvyatskiy expanded our recent paper on Stochastic model-based minimization of weakly convex functions with new results and more algorithm classes.
(8-8-2018) My paper with Dmitriy Drusvyatskiy, Kellie J. MacPhee, and Courtney Paquette on Subgradient methods for sharp weakly convex functions has been accepted for publication in the Journal of Optimization Theory and Applications.
(7-1-2018) Dmitriy Drusvyatskiy, Kellie J. MacPhee, and I have just uploaded a new paper on Stochastic model-based minimization under high-order growth.
(4-20-2018) Dmitriy Drusvyatskiy, Sham Kakade, Jason D. Lee, and I have just uploaded a paper which proves that the Stochastic subgradient method converges on tame functions.
(4-9-2018) Dima Drusvyatskiy and I wrote a blog post explaining our recent work on the stochastic subgradient method on weakly convex functions.
(3-23-2018) My paper with Luis M. Briceño-Arias on the Forward-Backward-Half Forward Algorithm for Solving Monotone Inclusions has been accepted for publication in SIAM Journal on Optimization.
(3-20-2018) Dmitriy Drusvyatskiy and I have just uploaded a paper on Stochastic model-based minimization of weakly convex functions.
(3-8-2018) Dmitriy Drusvyatskiy, Kellie J. MacPhee, Courtney Paquette, and I have just uploaded a paper that analyzes Subgradient methods for sharp weakly convex functions.
(2-21-2018) Dmitriy Drusvyatskiy and I just uploaded a supplementary technical note determining the Complexity of finding near-stationary points of convex functions stochastically.
(2-13-2018) Dmitriy Drusvyatskiy and I just uploaded a paper which shows that the Stochastic subgradient method converges at the rate \(O(k^{-1/4})\) on weakly convex functions.
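For readers unfamiliar with the weakly convex setting that several of the papers above study, here is a small illustrative sketch of the stochastic subgradient method on one such problem (robust phase retrieval, whose nonsmooth landscape is analyzed in the paper with Drusvyatskiy and Paquette). The problem instance, step-size schedule, and variable names below are my own illustrative choices, not taken from any of the papers.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic robust phase retrieval: recover x_star (up to sign) from
# magnitude-squared measurements b_i = (a_i^T x_star)^2.  The loss
#     f(x) = (1/m) * sum_i |(a_i^T x)^2 - b_i|
# is nonsmooth and nonconvex, but weakly convex.
d, m = 10, 200
A = rng.standard_normal((m, d))
x_star = rng.standard_normal(d)
b = (A @ x_star) ** 2

def loss(x):
    return np.mean(np.abs((A @ x) ** 2 - b))

def stochastic_subgradient(x, i):
    # A subgradient of the single-sample loss x -> |(a_i^T x)^2 - b_i|.
    r = A[i] @ x
    return np.sign(r ** 2 - b[i]) * 2.0 * r * A[i]

x = x_star + 0.3 * rng.standard_normal(d)  # initialize near x_star
init_loss = loss(x)
best_loss = init_loss

# Stochastic subgradient method with a decaying step size.
K = 5000
for k in range(K):
    i = rng.integers(m)
    x = x - 0.01 / np.sqrt(k + 1) * stochastic_subgradient(x, i)
    best_loss = min(best_loss, loss(x))

print(f"initial loss {init_loss:.3f} -> best loss {best_loss:.3f}")
```

Despite the nonsmoothness and nonconvexity, the method makes steady progress from a reasonable initialization, which is the qualitative behavior the \(O(k^{-1/4})\) rate (measured through the gradient of the Moreau envelope) quantifies.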