no code implementations • 3 Sep 2023 • Youssef Marzouk, Zhi Ren, Sven Wang, Jakob Zech
Ordinary differential equations (ODEs), via their induced flow maps, provide a powerful framework to parameterize invertible transformations for the purpose of representing complex probability distributions.
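The idea behind the abstract can be sketched numerically: integrating an ODE forward in time defines a map that is invertible simply by integrating backward. Below is a minimal sketch with a classical RK4 integrator and a hypothetical linear velocity field (my own illustrative choice, not the paper's construction).

```python
import numpy as np

def flow_map(x0, velocity, t0=0.0, t1=1.0, n_steps=100):
    """Approximate the flow map of dx/dt = velocity(x, t) with RK4 steps.

    Integrating from t0 to t1 pushes points forward; swapping t0 and t1
    (negative step size) approximates the inverse map, since ODE flows
    are invertible by construction.
    """
    x = np.asarray(x0, dtype=float).copy()
    h = (t1 - t0) / n_steps
    t = t0
    for _ in range(n_steps):
        k1 = velocity(x, t)
        k2 = velocity(x + 0.5 * h * k1, t + 0.5 * h)
        k3 = velocity(x + 0.5 * h * k2, t + 0.5 * h)
        k4 = velocity(x + h * k3, t + h)
        x = x + (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
        t += h
    return x

# Hypothetical velocity field: a contraction toward a shifted mean.
v = lambda x, t: -(x - 2.0)

x0 = np.array([0.0, 1.0, 5.0])
y = flow_map(x0, v)                      # forward map T(x0)
x_rec = flow_map(y, v, t0=1.0, t1=0.0)   # inverse map T^{-1}(y)
print(np.max(np.abs(x_rec - x0)))        # near zero: the flow is invertible
```

In a transport-of-measure setting, the same forward map would be used to push a reference distribution onto a target, with the backward integration providing the exact inverse needed for density evaluation.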
no code implementations • 19 Jul 2023 • Christoph Schwab, Andreas Stein, Jakob Zech
We establish universality and expression rate bounds for a class of neural Deep Operator Networks (DON) emulating Lipschitz (or Hölder) continuous maps $\mathcal G:\mathcal X\to\mathcal Y$ between (subsets of) separable Hilbert spaces $\mathcal X$, $\mathcal Y$.
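For orientation, the generic deep operator network architecture that such results concern combines a branch network (encoding the input function at finitely many sensors) with a trunk network (encoding the query point), contracted over a latent dimension. The sketch below uses random, untrained weights purely to show the structure; the network sizes and the sample input are illustrative assumptions, not the paper's construction.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(widths):
    """Random (untrained) MLP parameters; a stand-in for a trained network."""
    return [(rng.normal(size=(m, n)) / np.sqrt(m), np.zeros(n))
            for m, n in zip(widths[:-1], widths[1:])]

def forward(params, x):
    for i, (W, b) in enumerate(params):
        x = x @ W + b
        if i < len(params) - 1:
            x = np.tanh(x)
    return x

m, p = 32, 16                      # number of input sensors, latent dimension
branch = mlp([m, 64, p])           # encodes the input function u at m sensors
trunk = mlp([1, 64, p])            # encodes the query point y

def deeponet(u_sensors, y_queries):
    """G(u)(y) ~ <branch(u), trunk(y)>: a rank-p operator surrogate."""
    b = forward(branch, u_sensors)   # shape (p,)
    t = forward(trunk, y_queries)    # shape (n_query, p)
    return t @ b                     # shape (n_query,)

u = np.sin(np.linspace(0, np.pi, m))     # a sampled input function
y = np.linspace(0, 1, 5)[:, None]        # query points in the output domain
out = deeponet(u, y)
print(out.shape)  # (5,)
```

Expression rate results of the kind stated above quantify how the widths, depths, and sensor count must grow to reach a prescribed accuracy over a class of input functions.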
no code implementations • 11 Jul 2022 • Lukas Herrmann, Christoph Schwab, Jakob Zech
Specifically, we study approximation rates for Deep Neural Operator and Generalized Polynomial Chaos (gpc) Operator surrogates for nonlinear, holomorphic maps between infinite-dimensional, separable Hilbert spaces.
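The key fact exploited by such gpc results is that a holomorphic parameter-to-output map has polynomial chaos coefficients that decay exponentially. A minimal one-parameter sketch, with a hypothetical holomorphic map of my own choosing and a Legendre (uniform-measure gpc) surrogate fitted by least squares:

```python
import numpy as np
from numpy.polynomial import legendre

# Hypothetical holomorphic parameter-to-solution map on [-1, 1]:
f = lambda y: 1.0 / (2.0 + y)   # analytic in a neighborhood of [-1, 1]

# Fit a degree-d Legendre (gpc) surrogate by least squares on sample points.
d = 10
y_train = np.linspace(-1.0, 1.0, 200)
coef = legendre.legfit(y_train, f(y_train), d)

# Evaluate the surrogate and measure the sup-norm error on a fine grid.
y_test = np.linspace(-1.0, 1.0, 1000)
err = np.max(np.abs(legendre.legval(y_test, coef) - f(y_test)))
print(err)  # decays exponentially as d grows, reflecting holomorphy
```

The infinite-dimensional results replace the single parameter by a sequence of parameters and a multi-indexed polynomial basis, with the same holomorphy-driven decay controlling the surrogate's convergence rate.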
no code implementations • 14 Jan 2022 • Marcello Longo, Joost A. A. Opschoor, Nico Disch, Christoph Schwab, Jakob Zech
Our construction and DNN architecture generalize previous results in that no geometric restrictions on the regular simplicial partitions $\mathcal{T}$ of $\Omega$ are required for DNN emulation.
no code implementations • 13 Nov 2021 • Christoph Schwab, Jakob Zech
For artificial deep neural networks, we prove expression rates for analytic functions $f:\mathbb{R}^d\to\mathbb{R}$ in the norm of $L^2(\mathbb{R}^d,\gamma_d)$ where $d\in {\mathbb{N}}\cup\{ \infty \}$.
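The natural orthogonal system for the Gaussian-weighted norm $L^2(\mathbb{R},\gamma_1)$ is the family of probabilists' Hermite polynomials $He_k$, and for analytic $f$ the Hermite coefficients decay rapidly. A small sketch of this expansion for the illustrative choice $f=\exp$, whose coefficients have the closed form $c_k = e^{1/2}/k!$ (a standard identity, stated here as a sanity check rather than a claim from the paper):

```python
import numpy as np
from numpy.polynomial import hermite_e as H
from math import factorial, sqrt, pi, exp

# Probabilists' Hermite polynomials He_k are orthogonal for the standard
# Gaussian measure gamma_1, with E[He_j He_k] = k! * delta_{jk}.
f = np.exp  # analytic test function

# Gauss-Hermite quadrature for the weight e^{-x^2/2} (weights sum to sqrt(2*pi)).
x, w = H.hermegauss(40)

def coeff(k):
    # c_k = E_gamma[f(X) He_k(X)] / k!
    He_k = H.hermeval(x, [0] * k + [1])
    return (w @ (f(x) * He_k)) / (sqrt(2 * pi) * factorial(k))

c = np.array([coeff(k) for k in range(12)])
exact = np.array([exp(0.5) / factorial(k) for k in range(12)])
print(np.max(np.abs(c - exact)))  # small: quadrature recovers the expansion
```

Truncating such expansions at a finite set of indices, and emulating each $He_k$ by a small subnetwork, is one standard route from coefficient decay to DNN expression rates in $L^2(\mathbb{R}^d,\gamma_d)$.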