Madeleine Udell


Assistant Professor
Richard and Sybil Smith Sesquicentennial Fellow
Operations Research and Information Engineering (ORIE)
Graduate field member, ORIE, CS, and CAM
Cornell University


office: Rhodes 227


twitter: @madeleineudell

google: madeleine.udell

skype: madeleine.udell

github: madeleineudell

news and links

January 2017. I'll be teaching a class on Convex Optimization at Cornell in Spring 2017. We'll roughly follow Stanford's EE364a (and some of EE364b), using the excellent textbook by Boyd and Vandenberghe, with an additional emphasis on first-order methods.

November 2016. (Most) data scientists did a terrible job predicting the results of the 2016 election. Did that matter for the outcome? I analyze the data in a lecture on the limits (and dangers) of predictive modeling.

October 2016. Thanks to Cornell's Scientific Software Club for inviting me to give an introduction to Julia, and for asking great questions! Here are my slides + demos, which start with basic syntax and proceed to show off advanced capabilities like multi-language integration, shared memory parallelism, and mathematical optimization packages.
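To give a flavor of the basic syntax the talk starts from, here is a tiny illustrative snippet (not taken from the slides themselves): one-line function definitions, dot broadcasting, and comprehensions.

```julia
# One-line function definition
square(x) = x^2

# Broadcasting: the dot applies a function elementwise
xs = [1, 2, 3]
ys = square.(xs)            # [1, 4, 9]

# Comprehensions build arrays declaratively
evens = [2k for k in 1:5]   # [2, 4, 6, 8, 10]

println(ys)
println(evens)
```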

September 2016. I'm teaching a new class at Cornell on Learning with Big Messy Data. Interestingly, the course itself is generating a bunch of big messy data, from lecture slides to demos to Piazza posts to project repos. Next step: train an AI to learn how to learn with big messy data from this big messy data?

June 2016. Damek Davis, Brent Edmunds, and I just posted a paper on a (provably convergent) stochastic asynchronous optimization method called SAPALM for fitting generalized low rank models. It turns out asynchrony barely affects the rate of convergence (per flop), while providing a linear speedup in the number of flops per second. In other words: it's fast!

May 2016. Congratulations to Ramchandran Muthukumar and Ayush Pandey for their fantastic proposals to work with me on Convex.jl this summer through Google Summer of Code. Ayush will add support for complex numbers, while Ramchandran will develop a fast presolve routine.
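For readers unfamiliar with Convex.jl, here is a minimal sketch of how a problem is expressed in it: declare variables, build an objective and constraints, and hand the problem to a solver. (The solver interface has varied across Convex.jl versions; this sketch assumes a recent release with the SCS solver installed.)

```julia
using Convex, SCS

x = Variable(2)                              # a 2-dimensional decision variable
problem = minimize(sumsquares(x), [x >= 1])  # min ||x||^2 subject to x >= 1
solve!(problem, SCS.Optimizer)

println(problem.optval)                      # ≈ 2.0, attained at x = [1, 1]
```

The GSoC projects extend this modeling layer: complex-valued variables broaden the class of expressible problems, and a presolve pass simplifies the problem before it reaches the solver.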

March 2016. It was great meeting incoming PhD students at the ORIE visiting student days! Here are the slides I presented to introduce students to some of my research.

November 2015. H2O, a new framework for large-scale machine learning, has just released a great implementation of generalized low rank models (engineered by Anqi Fu). Here are the slides and the video from my talk at H2O World.