no code implementations • 27 Mar 2024 • Jannis Chemseddine, Paul Hagemann, Christian Wald, Gabriele Steidl
In inverse problems, many conditional generative models approximate the posterior measure by minimizing a distance between the joint measure and its learned approximation.
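As a rough illustration of this principle (not the paper's method), the sketch below fits a toy conditional affine flow by minimizing the forward KL divergence between the joint measure of (x, y) and its learned approximation, which reduces to conditional maximum likelihood; the forward model, dimensions, and architecture are all illustrative assumptions.

```python
import math
import torch
import torch.nn as nn

# Toy conditional affine flow x = exp(log_s(y)) * z + m(y), z ~ N(0, I).
# Minimizing KL(P_{X,Y} || P_Y x P_{X|Y;theta}) over theta reduces to
# conditional maximum likelihood on samples from the joint measure.
class CondAffineFlow(nn.Module):
    def __init__(self, dim_x, dim_y, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim_y, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * dim_x),
        )

    def log_prob(self, x, y):
        m, log_s = self.net(y).chunk(2, dim=-1)
        z = (x - m) * torch.exp(-log_s)              # invert the flow
        log_pz = -0.5 * (z ** 2).sum(-1) - 0.5 * x.shape[-1] * math.log(2 * math.pi)
        return log_pz - log_s.sum(-1)                # change of variables

# Hypothetical toy inverse problem: observe y = x + noise.
x = torch.randn(2048, 2)
y = x + 0.1 * torch.randn_like(x)
flow = CondAffineFlow(dim_x=2, dim_y=2)
opt = torch.optim.Adam(flow.parameters(), lr=1e-3)
for _ in range(500):
    opt.zero_grad()
    loss = -flow.log_prob(x, y).mean()               # joint KL up to a constant
    loss.backward()
    opt.step()
```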
1 code implementation • 5 Feb 2024 • Paul Hagemann, Johannes Hertrich, Maren Casfor, Sebastian Heidenreich, Gabriele Steidl
Motivated by indirect measurements and applications from nanometrology with a mixed noise model, we develop a novel algorithm for jointly estimating the posterior and the noise parameters in Bayesian inverse problems.
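The EM-flavored alternation below is only a schematic stand-in for the paper's algorithm: the learned posterior sampler is replaced by a dummy perturbation, and the mixed noise model var = a + b * f(x)^2, the forward map f, and all names are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

def f(x):                       # hypothetical forward operator
    return np.sin(x)

def neg_loglik(params, y, fx):  # mixed noise: var = a + b * f(x)**2
    a, b = np.exp(params)       # positivity via log-parametrization
    var = a + b * fx ** 2
    return 0.5 * np.sum(np.log(var) + (y - fx) ** 2 / var)

# Alternate between a (stand-in) posterior sampler and noise updates.
y_obs = f(1.0) + 0.1 * rng.standard_normal(200)
params = np.zeros(2)
for _ in range(10):
    # E-step stand-in: jitter a point estimate instead of running a
    # learned conditional sampler as in the paper.
    x_samples = 1.0 + 0.05 * rng.standard_normal(200)
    fx = f(x_samples)
    # M-step: maximum-likelihood update of the noise parameters.
    params = minimize(neg_loglik, params, args=(y_obs, fx)).x
a, b = np.exp(params)
```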
no code implementations • 27 Dec 2023 • Moritz Piening, Fabian Altekrüger, Johannes Hertrich, Paul Hagemann, Andrea Walther, Gabriele Steidl
The solution of inverse problems is of fundamental interest in medical and astronomical imaging, geophysics, as well as in engineering and the life sciences.
1 code implementation • 4 Oct 2023 • Paul Hagemann, Johannes Hertrich, Fabian Altekrüger, Robert Beinert, Jannis Chemseddine, Gabriele Steidl
We propose conditional flows of the maximum mean discrepancy (MMD) with the negative distance kernel for posterior sampling and conditional generative modeling.
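For reference, the (biased) estimate of the squared MMD with the negative distance kernel K(u, v) = -||u - v|| between two samples is exactly the energy distance and takes only a few lines; this is a generic sketch, not code from the paper's repository.

```python
import torch

def mmd_negative_distance(x, y):
    """Biased estimate of the squared MMD with kernel K(u, v) = -|u - v|,
    i.e. the energy distance between the samples x and y."""
    return (2 * torch.cdist(x, y).mean()
            - torch.cdist(x, x).mean()
            - torch.cdist(y, y).mean())
```

For posterior sampling, such a discrepancy would be applied to samples from the joint distribution of observations and reconstructions, making the learned flow conditional on the observation.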
1 code implementation • 19 May 2023 • Johannes Hertrich, Christian Wald, Fabian Altekrüger, Paul Hagemann
We prove that the MMD of Riesz kernels, which is also known as the energy distance, coincides with the MMD of their sliced version.
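This identity is easy to check numerically. The snippet below (sample sizes and distributions are illustrative) compares the energy distance in d dimensions with its sliced counterpart, rescaled by the constant c_d = sqrt(pi) * Gamma((d+1)/2) / Gamma(d/2) that relates a Euclidean norm to the mean absolute value of its random one-dimensional projections.

```python
import numpy as np
from scipy.special import gamma

rng = np.random.default_rng(0)
d, n, n_proj = 3, 200, 500

def energy_distance(x, y):
    pd = lambda a, b: np.linalg.norm(a[:, None] - b[None, :], axis=-1).mean()
    return 2 * pd(x, y) - pd(x, x) - pd(y, y)

x = rng.standard_normal((n, d))
y = rng.standard_normal((n, d)) + 1.0

# Sliced version: average 1D energy distances over random directions.
xi = rng.standard_normal((n_proj, d))
xi /= np.linalg.norm(xi, axis=1, keepdims=True)
sliced = np.mean([energy_distance((x @ v)[:, None], (y @ v)[:, None]) for v in xi])

c_d = np.sqrt(np.pi) * gamma((d + 1) / 2) / gamma(d / 2)  # slicing constant
print(energy_distance(x, y), c_d * sliced)                 # approximately equal
```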
no code implementations • 28 Mar 2023 • Fabian Altekrüger, Paul Hagemann, Gabriele Steidl
Conditional generative models have become a powerful tool for sampling from the posteriors of Bayesian inverse problems.
1 code implementation • 8 Mar 2023 • Paul Hagemann, Sophie Mildenberger, Lars Ruthotto, Gabriele Steidl, Nicole Tianjiao Yang
We aim to obtain diffusion models that generalize across different resolution levels and improve the efficiency of the training process.
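A minimal sketch of resolution-robust training, under assumptions not taken from the paper: a fully convolutional score network is trained with denoising score matching on batches randomly downsampled to coarser grids, so the same weights are applied across discretization levels.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Fully convolutional score network: applies at any grid size.
score_net = nn.Sequential(
    nn.Conv2d(1, 32, 3, padding=1), nn.SiLU(),
    nn.Conv2d(32, 1, 3, padding=1),
)
opt = torch.optim.Adam(score_net.parameters(), lr=1e-4)
images = torch.rand(16, 1, 32, 32)                 # hypothetical data
for step in range(100):
    k = int(torch.randint(0, 3, (1,)))             # pick a resolution level
    x0 = F.avg_pool2d(images, 2 ** k) if k else images
    sigma = 0.5
    noise = torch.randn_like(x0)
    xt = x0 + sigma * noise
    # denoising score matching: the target score is -noise / sigma
    loss = ((score_net(xt) + noise / sigma) ** 2).mean()
    opt.zero_grad(); loss.backward(); opt.step()
```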
1 code implementation • 24 May 2022 • Fabian Altekrüger, Alexander Denker, Paul Hagemann, Johannes Hertrich, Peter Maass, Gabriele Steidl
Learning neural networks from only a small amount of available data is an important ongoing research topic with tremendous potential for applications.
1 code implementation • 24 Nov 2021 • Paul Hagemann, Johannes Hertrich, Gabriele Steidl
Normalizing flows, diffusion normalizing flows and variational autoencoders are powerful generative models.
1 code implementation • 23 Sep 2021 • Paul Hagemann, Johannes Hertrich, Gabriele Steidl
To overcome topological constraints and improve the expressiveness of normalizing flow architectures, Wu, Köhler and Noé introduced stochastic normalizing flows which combine deterministic, learnable flow transformations with stochastic sampling methods.
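Schematically, one block of such a flow pairs a learnable deterministic layer with a stochastic sampling layer. The sketch below uses an unadjusted Langevin step as the stochastic part; names, step sizes, and the choice of layers are illustrative, and the training objective with its transition-kernel corrections is omitted, so this is a sampling sketch only.

```python
import math
import torch

def langevin_step(x, log_p, step=1e-2):
    """One unadjusted Langevin step targeting the density exp(log_p)."""
    x = x.detach().requires_grad_(True)
    grad = torch.autograd.grad(log_p(x).sum(), x)[0]
    return x + step * grad + math.sqrt(2 * step) * torch.randn_like(x)

class SNFBlock(torch.nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.shift = torch.nn.Parameter(torch.zeros(dim))
        self.log_scale = torch.nn.Parameter(torch.zeros(dim))

    def forward(self, x, log_p):
        x = x * self.log_scale.exp() + self.shift   # deterministic flow layer
        return langevin_step(x, log_p)              # stochastic sampling layer
```

In a full stochastic normalizing flow, several such blocks are chained with intermediate target densities interpolating between the latent and the target distribution.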
no code implementations • 5 Feb 2021 • Anna Andrle, Nando Farchmin, Paul Hagemann, Sebastian Heidenreich, Victor Soltwisch, Gabriele Steidl
Grazing incidence X-ray fluorescence is a non-destructive technique for analyzing the geometry and compositional parameters of nanostructures appearing, e.g., in computer chips.
1 code implementation • 7 Sep 2020 • Paul Hagemann, Sebastian Neumayer
In this paper, we analyze the properties of invertible neural networks, which provide a way of solving inverse problems.
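As background, the invertibility of such networks typically comes from coupling-type building blocks. The additive coupling layer below is a generic, minimal example (not the specific architecture analyzed in the paper) whose inverse is available in closed form and whose Jacobian determinant equals one.

```python
import torch
import torch.nn as nn

class AdditiveCoupling(nn.Module):
    """Splits the input in two halves and shifts the second half by a
    learned function of the first; invertible by construction."""
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.half = dim // 2
        self.net = nn.Sequential(
            nn.Linear(self.half, hidden), nn.ReLU(),
            nn.Linear(hidden, dim - self.half),
        )

    def forward(self, x):
        x1, x2 = x[:, :self.half], x[:, self.half:]
        return torch.cat([x1, x2 + self.net(x1)], dim=1)

    def inverse(self, y):
        y1, y2 = y[:, :self.half], y[:, self.half:]
        return torch.cat([y1, y2 - self.net(y1)], dim=1)

layer = AdditiveCoupling(4)
x = torch.randn(8, 4)
assert torch.allclose(layer.inverse(layer(x)), x, atol=1e-6)
```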