no code implementations • 13 Feb 2024 • Valentin De Bortoli, Michael Hutchinson, Peter Wirnsberger, Arnaud Doucet
Denoising Score Matching estimates the score of a noised version of a target distribution by minimizing a regression loss and is widely used to train the popular class of Denoising Diffusion Models.
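The objective sketched in this abstract can be written as a simple regression loss. Below is a minimal NumPy illustration for Gaussian noising, where the regression target is the score of the noising kernel, `-(x_tilde - x) / sigma**2`; the function names and setup are illustrative, not taken from the paper:

```python
import numpy as np

def dsm_loss(score_fn, x, sigma, rng):
    """Monte Carlo estimate of the denoising score matching loss.

    For Gaussian noise x_tilde = x + sigma * eps, the minimizer of this
    regression loss is the score of the noised data distribution.
    """
    eps = rng.standard_normal(x.shape)
    x_tilde = x + sigma * eps
    target = -(x_tilde - x) / sigma**2  # score of N(x_tilde; x, sigma^2 I)
    return np.mean(np.sum((score_fn(x_tilde) - target) ** 2, axis=-1))
```

For data drawn from a standard normal, the noised marginal is N(0, (1 + sigma^2) I), whose score `-x / (1 + sigma**2)` should achieve a lower loss than any other candidate.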
1 code implementation • 11 Apr 2023 • Nic Fishman, Leo Klarner, Valentin De Bortoli, Emile Mathieu, Michael Hutchinson
Denoising diffusion models are a novel class of generative algorithms that achieve state-of-the-art performance across a range of domains, including image generation and text-to-image tasks.
no code implementations • 28 Sep 2022 • Angus Phillips, Thomas Seror, Michael Hutchinson, Valentin De Bortoli, Arnaud Doucet, Emile Mathieu
Score-based generative modelling (SGM) has proven to be a very effective method for modelling densities on finite-dimensional spaces.
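In the finite-dimensional Euclidean case this abstract refers to, the forward noising process of an SGM is typically an Ornstein–Uhlenbeck SDE that drives data toward a standard Gaussian. A minimal Euler–Maruyama sketch of that forward process (the parameters and function name are illustrative assumptions, not from the paper):

```python
import numpy as np

def ou_forward(x0, n_steps=1000, dt=1e-3, rng=None):
    """Euler-Maruyama discretisation of the forward noising SDE
    dX_t = -X_t dt + sqrt(2) dW_t, whose stationary law is N(0, I)."""
    rng = rng or np.random.default_rng(0)
    x = x0.astype(float).copy()
    for _ in range(n_steps):
        x += -x * dt + np.sqrt(2 * dt) * rng.standard_normal(x.shape)
    return x
```

After time T, the marginal of an OU process started at x0 has mean `x0 * exp(-T)` and variance `1 - exp(-2T)`, so samples visibly contract toward the Gaussian prior.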
no code implementations • 7 Jul 2022 • James Thornton, Michael Hutchinson, Emile Mathieu, Valentin De Bortoli, Yee Whye Teh, Arnaud Doucet
Our proposed method generalizes the Diffusion Schrödinger Bridge introduced in De Bortoli et al. (2021) to the non-Euclidean setting and extends Riemannian score-based models beyond the first time reversal.

2 code implementations • 6 Feb 2022 • Valentin De Bortoli, Emile Mathieu, Michael Hutchinson, James Thornton, Yee Whye Teh, Arnaud Doucet
Score-based generative models (SGMs) are a powerful class of generative models that exhibit remarkable empirical performance.
no code implementations • NeurIPS 2021 • Michael Hutchinson, Alexander Terenin, Viacheslav Borovitskiy, So Takao, Yee Whye Teh, Marc Peter Deisenroth
Gaussian processes are machine learning models capable of learning unknown functions in a way that represents uncertainty, thereby facilitating construction of optimal decision-making systems.
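The uncertainty representation this abstract mentions comes from the GP posterior, which has a closed form. A self-contained NumPy sketch of exact GP regression with an RBF kernel (a standard construction for illustration, not the paper's method):

```python
import numpy as np

def rbf(a, b, lengthscale=1.0):
    """Squared-exponential kernel on 1-D inputs."""
    d2 = (a[:, None] - b[None, :]) ** 2
    return np.exp(-0.5 * d2 / lengthscale**2)

def gp_posterior(x_train, y_train, x_test, noise=1e-2):
    """Exact GP posterior mean and covariance via a Cholesky solve."""
    K = rbf(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf(x_train, x_test)
    Kss = rbf(x_test, x_test)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    cov = Kss - v.T @ v
    return mean, cov
```

Near observed inputs the posterior variance collapses toward the noise level; far from the data it reverts to the prior variance, which is exactly the behaviour that makes GPs useful for decision-making.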
1 code implementation • 20 Dec 2020 • Michael Hutchinson, Charline Le Lan, Sheheryar Zaidi, Emilien Dupont, Yee Whye Teh, Hyunjik Kim
Group equivariant neural networks are used as building blocks of group invariant neural networks, which have been shown to improve generalisation performance and data efficiency through principled parameter sharing.
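The parameter sharing described here can be seen in the simplest equivariant layer: circular convolution, which commutes with the cyclic-shift group. A small NumPy check of this property (illustrative only, not the architectures studied in the paper):

```python
import numpy as np

def circ_conv(x, k):
    """Circular convolution of a signal x with filter k.

    Shift equivariance: circ_conv(roll(x, s), k) == roll(circ_conv(x, k), s),
    i.e. applying the group action before or after the layer gives the same result.
    """
    n = len(x)
    return np.array([
        sum(x[(i - j) % n] * k[j] for j in range(len(k)))
        for i in range(n)
    ])
```

Stacking such equivariant layers and finishing with a shift-invariant pooling (e.g. a mean over positions) yields a shift-invariant network, which is the invariant-from-equivariant construction the abstract alludes to.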
1 code implementation • 25 Nov 2020 • Peter Holderrieth, Michael Hutchinson, Yee Whye Teh
Motivated by objects such as electric fields or fluid streams, we study the problem of learning stochastic fields, i.e. stochastic processes whose samples are fields like those occurring in physics and engineering.
1 code implementation • 24 Nov 2019 • Mrinank Sharma, Michael Hutchinson, Siddharth Swaroop, Antti Honkela, Richard E. Turner
This setting is known as federated learning, in which privacy is a key concern.