Deep Probabilistic Machine Learning

This project aimed to develop scalable approaches to deep, non-parametric probabilistic models that use approximate inference to learn the structure of the model. The project also required the development of practical, interpretable models, with latent variables that can be used meaningfully by non-academics.

In this project, we developed scalable approaches for applying Gaussian process regression to multi-output signals with non-linear dependencies. This combined physics-based techniques, such as series representations and ordinary differential equation solvers, with machine learning techniques such as Gaussian processes (GPs) and autoregressive flows, to infer latent variables and forces. The project produced two papers: one on using Volterra series for non-linear multi-output Gaussian processes (NLMOGPs) [1], and one on normalising flows for inferring latent forces in non-linear models [2].
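
The toy sketch below illustrates the general idea behind non-linear multi-output GPs, not the inference method of either paper: several outputs are built as non-linear (Volterra-style) functionals of smoothed copies of a single shared latent GP. All kernel choices, coefficients, and noise levels here are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(x1, x2, lengthscale=0.3, variance=1.0):
    """Squared-exponential covariance between two 1-D input sets."""
    d = x1[:, None] - x2[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 200)

# Draw one shared latent function u(t) from a GP prior.
K = rbf_kernel(t, t) + 1e-8 * np.eye(t.size)
u = rng.multivariate_normal(np.zeros(t.size), K)

def smooth(u, t, lengthscale):
    """Discrete stand-in for convolving u with an output-specific kernel."""
    G = rbf_kernel(t, t, lengthscale=lengthscale)
    G /= G.sum(axis=1, keepdims=True)
    return G @ u

# First-order (linear) terms: each output smooths u differently.
f1 = smooth(u, t, lengthscale=0.05)
f2 = smooth(u, t, lengthscale=0.15)

# Second-order, Volterra-style terms: non-linear combinations of the
# smoothed latent process, plus observation noise.
y1 = f1 + 0.5 * f1 ** 2 + 0.05 * rng.standard_normal(t.size)
y2 = f2 - 0.3 * f1 * f2 + 0.05 * rng.standard_normal(t.size)
```

Because both outputs are driven by the same latent draw `u`, they are statistically dependent even though each applies its own smoothing and non-linear distortion; recovering `u` from `y1` and `y2` is the kind of latent inference problem the project addressed.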


  1. M. A. Álvarez, W. O. C. Ward, C. Guarnizo. Non-linear process convolutions for multi-output Gaussian processes. AISTATS 2019.
  2. W. O. C. Ward, T. Ryder, D. Prangle, M. A. Álvarez. Black-box inference for non-linear latent force models. AISTATS 2020.
Wil O. C. Ward, Research Associate