no code implementations • 19 Jan 2024 • Minkai Xu, Jiaqi Han, Aaron Lou, Jean Kossaifi, Arvind Ramanathan, Kamyar Azizzadenesheli, Jure Leskovec, Stefano Ermon, Anima Anandkumar
Modeling the complex three-dimensional (3D) dynamics of relational systems is an important problem in the natural sciences, with applications ranging from molecular simulations to particle mechanics.
no code implementations • 21 Nov 2023 • Bram Wallace, Meihua Dang, Rafael Rafailov, Linqi Zhou, Aaron Lou, Senthil Purushwalkam, Stefano Ermon, Caiming Xiong, Shafiq Joty, Nikhil Naik
Large language models (LLMs) are fine-tuned using human comparison data with Reinforcement Learning from Human Feedback (RLHF) methods to make them better aligned with users' preferences.
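The pairwise comparison data behind RLHF is typically fit with a Bradley–Terry objective, where the probability that the chosen response beats the rejected one is a sigmoid of their reward margin. A minimal sketch of that loss (function names are illustrative, not from the paper):

```python
import math

def preference_loss(reward_chosen, reward_rejected):
    # Bradley-Terry negative log-likelihood: P(chosen beats rejected)
    # is sigmoid(reward_chosen - reward_rejected).
    margin = reward_chosen - reward_rejected
    return -math.log(1.0 / (1.0 + math.exp(-margin)))

# A larger reward margin for the preferred response lowers the loss.
assert preference_loss(2.0, 0.0) < preference_loss(0.5, 0.0)
```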
1 code implementation • 25 Oct 2023 • Aaron Lou, Chenlin Meng, Stefano Ermon
Experimentally, we test our Score Entropy Discrete Diffusion models (SEDD) on standard language modeling tasks.
1 code implementation • 29 Sep 2023 • Linqi Zhou, Aaron Lou, Samar Khanna, Stefano Ermon
However, for many applications such as image editing, the model input comes from a distribution that is not random noise.
1 code implementation • 10 Apr 2023 • Aaron Lou, Stefano Ermon
To incorporate data constraints in a principled manner, we present Reflected Diffusion Models, which instead reverse a reflected stochastic differential equation evolving on the support of the data.
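The key mechanism is that the stochastic process is reflected back into the data support whenever a step would leave it, so samples never violate the constraint. A minimal sketch of one reflected Euler–Maruyama step on the interval [0, 1] (a toy illustration under assumed dynamics, not the paper's implementation):

```python
import numpy as np

def reflect_into_unit_interval(x):
    # Billiard reflection across the boundaries of [0, 1]:
    # fold the real line so every value lands back in the interval.
    x = np.mod(x, 2.0)
    return np.where(x > 1.0, 2.0 - x, x)

def reflected_brownian_step(x, dt, rng):
    # One Euler-Maruyama step of Brownian motion; any excursion
    # outside [0, 1] is reflected back onto the data support.
    x_next = x + np.sqrt(dt) * rng.standard_normal(x.shape)
    return reflect_into_unit_interval(x_next)

rng = np.random.default_rng(0)
x = np.full(1000, 0.5)
for _ in range(100):
    x = reflected_brownian_step(x, dt=0.01, rng=rng)
# Every trajectory stays inside the support.
assert x.min() >= 0.0 and x.max() <= 1.0
```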
Ranked #1 on Image Generation on CIFAR-10 (Inception score metric)
2 code implementations • NeurIPS 2021 • Tolga Birdal, Aaron Lou, Leonidas Guibas, Umut Şimşekli
Disobeying the classical wisdom of statistical learning theory, modern deep neural networks generalize well even though they typically contain millions of parameters.
no code implementations • 29 Sep 2021 • Aaron Lou, Maximilian Nickel, Mustafa Mukadam, Brandon Amos
We present Deep Riemannian Manifolds, a new class of neural-network-parameterized Riemannian manifolds that can represent and learn complex geometric structures.

1 code implementation • NeurIPS 2021 • Isay Katsman, Aaron Lou, Derek Lim, Qingxuan Jiang, Ser-Nam Lim, Christopher De Sa
Tractably modelling distributions over manifolds has long been an important goal in the natural sciences.
3 code implementations • NeurIPS 2020 • Aaron Lou, Derek Lim, Isay Katsman, Leo Huang, Qingxuan Jiang, Ser-Nam Lim, Christopher De Sa
To better conform to data geometry, recent deep generative modelling techniques adapt Euclidean constructions to non-Euclidean spaces.
2 code implementations • ICML 2020 • Aaron Lou, Isay Katsman, Qingxuan Jiang, Serge Belongie, Ser-Nam Lim, Christopher De Sa
Recent advances in deep representation learning on Riemannian manifolds extend classical deep learning operations to better capture the geometry of the manifold.
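A generic example of one such geometric operation is the exponential map, which moves from a point along a tangent vector while staying on the manifold. A sketch for the unit sphere (a standard textbook construction, not code from the paper):

```python
import numpy as np

def sphere_exp_map(x, v):
    # Exponential map on the unit sphere: travel from point x along
    # tangent vector v by geodesic distance |v|, staying on the sphere.
    norm_v = np.linalg.norm(v)
    if norm_v < 1e-12:
        return x
    return np.cos(norm_v) * x + np.sin(norm_v) * (v / norm_v)

x = np.array([1.0, 0.0, 0.0])            # point on the sphere
v = np.array([0.0, np.pi / 2, 0.0])      # tangent vector at x (v . x = 0)
y = sphere_exp_map(x, v)
# The result still lies on the unit sphere.
assert abs(np.linalg.norm(y) - 1.0) < 1e-9
```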
no code implementations • 4 Dec 2018 • Horace He, Aaron Lou, Qingxuan Jiang, Isay Katsman, Serge Belongie, Ser-Nam Lim
Research has shown that widely used deep neural networks are vulnerable to carefully crafted adversarial perturbations.
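A classic illustration of such a crafted perturbation is the Fast Gradient Sign Method, which nudges each input coordinate by a small epsilon in the direction that increases the loss (shown here on a toy linear model; this is background context, not the construction studied in the paper):

```python
import numpy as np

def fgsm_perturb(x, grad_loss_wrt_x, eps):
    # Fast Gradient Sign Method: step each coordinate by eps in the
    # sign of the loss gradient to maximally increase the loss.
    return x + eps * np.sign(grad_loss_wrt_x)

# Toy linear classifier: score = w . x, loss = -label * score.
w = np.array([0.5, -1.0, 2.0])
x = np.array([1.0, 1.0, 1.0])
label = 1.0
grad = -label * w                  # gradient of the loss w.r.t. x
x_adv = fgsm_perturb(x, grad, eps=0.1)
# The perturbation lowers the correct-class score.
assert w @ x_adv < w @ x
```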