Andrew Gordon Wilson


Scalable Gaussian Processes for Scientific Discovery
A seminar talk at EPFL. In this talk I introduce Gaussian processes and outline a philosophy for model construction and scalable inference. I present several works, including spectral mixture kernels, Kronecker inference, and deep kernel learning, alongside scientific applications. Many of these methods are now available in the new GPyTorch library. Various tutorial resources are also available.
Lausanne, Switzerland, February 2016
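The spectral mixture kernel mentioned above has a simple closed form (Wilson & Adams, 2013). A minimal NumPy sketch of the one-dimensional case is below; the hyperparameter values are illustrative, not taken from the talk:

```python
import numpy as np

def spectral_mixture_kernel(tau, weights, means, variances):
    """One-dimensional spectral mixture kernel:
    k(tau) = sum_q w_q * exp(-2 pi^2 tau^2 v_q) * cos(2 pi tau mu_q),
    where tau is the distance between inputs, w_q are mixture weights,
    and (mu_q, v_q) are the spectral means and variances."""
    tau = np.asarray(tau, dtype=float)
    k = np.zeros_like(tau)
    for w, mu, v in zip(weights, means, variances):
        k += w * np.exp(-2.0 * np.pi**2 * tau**2 * v) * np.cos(2.0 * np.pi * tau * mu)
    return k

# At tau = 0 the kernel value equals the sum of the mixture weights.
k0 = spectral_mixture_kernel([0.0], weights=[0.6, 0.4],
                             means=[0.1, 1.0], variances=[0.05, 0.2])
```

With enough mixture components this family can approximate any stationary kernel, which is what makes it useful for pattern discovery and extrapolation.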

Bayesian Deep Learning
A talk at the Toronto Deep Learning Summer School (DLSS) on Bayesian deep learning, covering a philosophy for model construction, the structure of loss surfaces, and a function-space view of machine learning.
Toronto, Canada, July 2018

Bayesian GAN
An invited talk on the Bayesian GAN at the BIRS Workshop at the Interface of Machine Learning and Statistics, including an introduction to generative adversarial networks and Bayesian deep learning.
Banff, Canada, January 2018

Bayesian Optimization with Gradients
A NIPS 2017 oral, presented jointly with Peter Frazier, on how best to exploit derivative information in Bayesian optimization.
Long Beach, USA, December 2017

Stochastic Variational Deep Kernel Learning
A short three-minute video introducing our NIPS 2016 paper. This method is implemented in GPyTorch.
Montreal, Canada, December 2016

Kernel Interpolation for Scalable Structured Gaussian Processes
Presenting KISS-GP at ICML 2015. This method is implemented in GPyTorch; a Matlab tutorial and further thoughts on KISS-GP are also available.
Lille, France, July 2015
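The core idea of KISS-GP is to approximate a dense kernel matrix as K ≈ W K_ZZ Wᵀ, where K_ZZ is the kernel evaluated on a regular inducing grid and W is a sparse interpolation matrix. KISS-GP proper uses local cubic interpolation and exploits grid structure for fast matrix-vector products; the NumPy sketch below uses linear interpolation on a dense W purely to illustrate the approximation, with all names and values chosen for illustration:

```python
import numpy as np

def rbf_kernel(a, b, lengthscale=0.5):
    """Squared-exponential kernel between two sets of 1-D points."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

def interpolation_weights(x, grid):
    """Linear interpolation of points x onto a regular 1-D grid.
    Each row of W has at most two nonzeros that sum to one."""
    n, m = len(x), len(grid)
    h = grid[1] - grid[0]
    W = np.zeros((n, m))
    for i, xi in enumerate(x):
        j = int(np.clip((xi - grid[0]) // h, 0, m - 2))  # left grid neighbour
        frac = (xi - grid[j]) / h
        W[i, j] = 1.0 - frac
        W[i, j + 1] = frac
    return W

rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, size=50)
grid = np.linspace(0.0, 1.0, 200)

W = interpolation_weights(x, grid)
K_approx = W @ rbf_kernel(grid, grid) @ W.T   # structured approximation of K(x, x)
err = np.max(np.abs(K_approx - rbf_kernel(x, x)))
```

Because W is sparse and K_ZZ has Toeplitz (or Kronecker) structure on a grid, the real method reduces inference cost to near-linear in the number of data points.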

Other talks can be found linked in my paper list.