no code implementations • 10 Oct 2022 • Ahmet Iscen, Thomas Bird, Mathilde Caron, Alireza Fathi, Cordelia Schmid
We study class-incremental learning, a training setup in which new classes of data arrive over time and must be learned by the model as they appear.
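The class-incremental setup can be sketched as a partition of the label set into sequential tasks, with the model seeing one task's classes at a time. This is an illustrative sketch of the data stream only, not the paper's method; the function name and task size are hypothetical.

```python
# Hypothetical sketch of a class-incremental stream: the label set is split
# into sequential tasks, and the model observes one task's classes at a time.

def class_incremental_tasks(labels, classes_per_task):
    """Partition the set of class labels into an ordered sequence of tasks."""
    classes = sorted(set(labels))
    return [classes[i:i + classes_per_task]
            for i in range(0, len(classes), classes_per_task)]

# Ten classes observed two at a time yield five sequential tasks.
tasks = class_incremental_tasks(range(10), classes_per_task=2)
```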
no code implementations • 26 Apr 2021 • Thomas Bird, Johannes Ballé, Saurabh Singh, Philip A. Chou
We unify these steps by directly compressing an implicit representation of the scene, a function that maps spatial coordinates to a radiance vector field, which can then be queried to render arbitrary viewpoints.
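An implicit representation of this kind is just a function from spatial coordinates to radiance values, typically a small neural network. Below is a minimal sketch of such a coordinate network in NumPy; the layer sizes and random weights are illustrative only (the paper's contribution is compressing such a representation, which is not shown here).

```python
# Minimal sketch of an implicit scene representation: a tiny MLP mapping a
# 3D spatial coordinate to an RGB radiance value. Weights are random and
# purely illustrative.
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(size=(3, 32))   # coordinate (x, y, z) -> hidden features
W2 = rng.normal(size=(32, 3))   # hidden features -> radiance (r, g, b)

def radiance(coord):
    """Query the implicit representation at a single 3D point."""
    h = np.maximum(coord @ W1, 0.0)          # ReLU hidden layer
    return 1.0 / (1.0 + np.exp(-(h @ W2)))   # sigmoid keeps RGB in [0, 1]

rgb = radiance(np.array([0.1, -0.2, 0.5]))
```

Rendering a viewpoint then amounts to querying this function at many coordinates along camera rays.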
no code implementations • ICLR 2021 • Thomas Bird, Friso H. Kingma, David Barber
In this work we show, for the first time, that we can successfully train generative models which utilize binary neural networks.
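In a binary neural network, the weights used in the forward pass are constrained to {-1, +1}. The sketch below shows one common binarization scheme (taking the sign of real-valued latent weights); this is a generic illustration and not necessarily the scheme used in the paper.

```python
# Illustrative forward pass through a layer with weights binarized to
# {-1, +1}. Binarizing the sign of real-valued "shadow" weights is one
# standard choice; sizes and values here are arbitrary.
import numpy as np

rng = np.random.default_rng(1)
latent_W = rng.normal(size=(4, 2))               # real-valued latent weights
binary_W = np.where(latent_W >= 0, 1.0, -1.0)    # binarized for the forward pass

x = np.array([0.5, -1.0, 2.0, 0.25])
y = x @ binary_W                                 # multiply-free in principle:
                                                 # only additions/subtractions
```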
1 code implementation • ICLR 2020 • James Townsend, Thomas Bird, Julius Kunze, David Barber
We make the following striking observation: fully convolutional VAE models trained on 32x32 ImageNet can generalize well, not just to 64x64 but also to far larger photographs, with no changes to the model.
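The reason a fully convolutional model can be applied to larger images at all is that a convolution kernel slides over inputs of any spatial size; only the output size changes. A toy valid 2D convolution makes this concrete (this is a generic illustration, not the paper's VAE):

```python
# A toy "valid" 2D convolution: the same fixed kernel applies to inputs of
# any spatial size, which is why fully convolutional models can process
# images larger than those seen in training.
import numpy as np

def conv2d_valid(image, kernel):
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

kernel = np.ones((3, 3)) / 9.0                       # one fixed 3x3 kernel
small = conv2d_valid(np.zeros((32, 32)), kernel)     # output: (30, 30)
large = conv2d_valid(np.zeros((64, 64)), kernel)     # output: (62, 62)
```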
no code implementations • 27 Jul 2019 • Mingtian Zhang, Thomas Bird, Raza Habib, Tianlin Xu, David Barber
Probabilistic models are often trained by maximum likelihood, which corresponds to minimizing a specific f-divergence between the model and data distribution.
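The correspondence referred to here is the standard identity that maximizing likelihood is equivalent to minimizing the KL divergence from the data distribution, and KL is the f-divergence with $f(t) = t \log t$:

```latex
% Maximum likelihood as divergence minimization (standard identity):
\begin{align}
\arg\max_\theta \; \mathbb{E}_{x \sim p_{\mathrm{data}}}\!\left[\log p_\theta(x)\right]
  &= \arg\min_\theta \; \mathrm{KL}\!\left(p_{\mathrm{data}} \,\|\, p_\theta\right), \\
\mathrm{KL}(p \,\|\, q) &= D_f(p \,\|\, q) \quad \text{for } f(t) = t \log t, \\
D_f(p \,\|\, q) &= \mathbb{E}_{x \sim q}\!\left[ f\!\left(\tfrac{p(x)}{q(x)}\right) \right].
\end{align}
```

Other choices of convex $f$ give other f-divergences, which is what makes the family a natural generalization of maximum likelihood training.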
no code implementations • 27 Sep 2018 • Mingtian Zhang, Thomas Bird, Raza Habib, Tianlin Xu, David Barber
Probabilistic models are often trained by maximum likelihood, which corresponds to minimizing a specific form of f-divergence between the model and data distribution.
no code implementations • 13 Sep 2018 • Thomas Bird, Julius Kunze, David Barber
These approaches are of particular interest because they are parallelizable.