Andrew Gordon Wilson
Code repositories for group projects and close collaborations.
GPyTorch
Implements state-of-the-art scalable Gaussian processes in PyTorch (a minimal usage sketch follows the list), including:
(1) SKI/KISS-GP [older but helpful Matlab tutorials below]
(2) Deep Kernel Learning [older but helpful Matlab tutorials below]
(3) Stochastic Variational Deep Kernel Learning
(4) Scalable Kernel Learning by Stochastic Lanczos Expansions
(5) Spectral Mixture Kernels [older but helpful Matlab tutorials below]
(6) SKIP (scaling SKI/KISS-GP to higher dimensions)
(7) LOVE (constant-time predictive distributions)
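As a minimal sketch of how these pieces fit together, the following trains an exact GP with a Spectral Mixture Kernel and uses LOVE (gpytorch.settings.fast_pred_var) for fast predictive variances. It is based on the GPyTorch tutorials; the data and hyperparameter choices are illustrative placeholders.

    import torch
    import gpytorch

    class SpectralMixtureGP(gpytorch.models.ExactGP):
        def __init__(self, train_x, train_y, likelihood):
            super().__init__(train_x, train_y, likelihood)
            self.mean_module = gpytorch.means.ConstantMean()
            # Spectral mixture kernel; the number of mixture components is illustrative
            self.covar_module = gpytorch.kernels.SpectralMixtureKernel(num_mixtures=4)
            self.covar_module.initialize_from_data(train_x, train_y)

        def forward(self, x):
            return gpytorch.distributions.MultivariateNormal(
                self.mean_module(x), self.covar_module(x))

    # Toy data: a noisy sinusoid
    train_x = torch.linspace(0, 1, 100)
    train_y = torch.sin(train_x * 6.28) + 0.1 * torch.randn(100)

    likelihood = gpytorch.likelihoods.GaussianLikelihood()
    model = SpectralMixtureGP(train_x, train_y, likelihood)

    # Fit hyperparameters by maximizing the exact marginal likelihood
    model.train(); likelihood.train()
    optimizer = torch.optim.Adam(model.parameters(), lr=0.1)
    mll = gpytorch.mlls.ExactMarginalLogLikelihood(likelihood, model)
    for _ in range(100):
        optimizer.zero_grad()
        loss = -mll(model(train_x), train_y)
        loss.backward()
        optimizer.step()

    # Predict; fast_pred_var enables LOVE for fast predictive variances
    model.eval(); likelihood.eval()
    test_x = torch.linspace(0, 1.5, 51)
    with torch.no_grad(), gpytorch.settings.fast_pred_var():
        preds = likelihood(model(test_x))
        mean, var = preds.mean, preds.variance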
GP Kernel Learning Tutorials
Tutorials for SKI/KISS-GP, Spectral Mixture Kernels, Kronecker Inference, and Deep Kernel Learning. The accompanying code is in Matlab and is now mostly out of date; the implementations in GPyTorch are typically much more efficient. However, the tutorial material and code are still very useful for anyone wanting to understand the building blocks of, and practical advice for, SKI/KISS-GP, Spectral Mixture Kernels, or Kronecker Inference.
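For comparison with the Matlab code, the SKI/KISS-GP building block in GPyTorch amounts to wrapping a base kernel in a grid interpolation kernel; a minimal sketch follows (the grid size is an illustrative choice, not a recommendation), which can replace covar_module in the exact-GP sketch above.

    import gpytorch

    # SKI/KISS-GP: interpolate the base kernel onto a regular grid of inducing points
    covar_module = gpytorch.kernels.GridInterpolationKernel(
        gpytorch.kernels.ScaleKernel(gpytorch.kernels.RBFKernel()),
        grid_size=400,   # number of grid (inducing) points, illustrative
        num_dims=1,      # SKIP extends this idea to higher dimensions
    )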
Stochastic Weight Averaging (SWA)
SWA is a simple DNN training method that can be used as a drop-in replacement for SGD, with improved generalization, faster convergence, and essentially no overhead. In this repository we provide a PyTorch implementation of SWA.
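SWA has since also been incorporated into PyTorch itself (torch.optim.swa_utils); the sketch below shows the general recipe on a toy problem. The model, data, and hyperparameter choices are illustrative placeholders rather than values from this repository.

    import torch
    from torch import nn
    from torch.optim.swa_utils import AveragedModel, SWALR, update_bn

    # Toy regression setup (illustrative only)
    model = nn.Sequential(nn.Linear(10, 64), nn.ReLU(), nn.Linear(64, 1))
    x = torch.randn(512, 10)
    y = x.sum(dim=1, keepdim=True) + 0.1 * torch.randn(512, 1)
    loader = torch.utils.data.DataLoader(
        torch.utils.data.TensorDataset(x, y), batch_size=64)
    loss_fn = nn.MSELoss()

    optimizer = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)
    swa_model = AveragedModel(model)       # running average of the weights
    swa_scheduler = SWALR(optimizer, swa_lr=0.05)
    swa_start = 75                         # epoch at which averaging begins

    for epoch in range(100):
        for xb, yb in loader:
            optimizer.zero_grad()
            loss_fn(model(xb), yb).backward()
            optimizer.step()
        if epoch >= swa_start:
            swa_model.update_parameters(model)  # equal-weight average of SGD iterates
            swa_scheduler.step()

    # Recompute BatchNorm statistics for the averaged model (a no-op here,
    # since the toy model has no BatchNorm layers), then evaluate swa_model
    # rather than model at test time.
    update_bn(loader, swa_model)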
fast-SWA and Semi-Supervised Learning
Provides a PyTorch implementation of fast-SWA and the record-breaking semi-supervised results in Improving Consistency-Based Semi-Supervised Learning with Weight Averaging.
Word2GM
Implements probabilistic Gaussian mixture word embeddings in TensorFlow.
BayesGAN
Implements the Bayesian GAN in TensorFlow.
Hierarchical Density Order Embeddings
Provides a Torch implementation of our ICLR 2018 paper, in which we learn hierarchical representations of concepts through encapsulation of probability densities.
Probabilistic FastText
Provides a C++ implementation of our ACL 2018 paper, in which we learn density embeddings that account for sub-word structure and multiple word senses.
Gaussian Processes for Machine Learning
The iconic GPML toolbox, the official software accompaniment to the Gaussian Processes for Machine Learning textbook. GPML includes native support for Spectral Mixture Kernels, Kronecker Inference, and SKI/KISS-GP. Tutorials for this material based on GPML can be found in the GP Kernel Learning Tutorials entry above.