1 code implementation • 2 May 2024 • Patricia Pauli, Dennis Gramlich, Frank Allgöwer
This paper is devoted to the estimation of the Lipschitz constant of neural networks using semidefinite programming.
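For intuition, a minimal sketch of what SDP-based estimators improve upon: the crude Lipschitz upper bound given by the product of layer spectral norms. The two-layer ReLU network below is a hypothetical example, not from the paper; SDP-based methods certify tighter bounds than this naive product.

```python
import numpy as np

# Hypothetical two-layer ReLU network f(x) = W2 @ relu(W1 @ x).
rng = np.random.default_rng(0)
W1 = rng.standard_normal((8, 4))
W2 = rng.standard_normal((2, 8))

def spectral_norm(W):
    # Largest singular value = l2 operator norm of the layer.
    return np.linalg.norm(W, 2)

# Since relu is 1-Lipschitz, the product of layer norms is a valid
# (but generally loose) Lipschitz upper bound for f.
naive_bound = spectral_norm(W1) * spectral_norm(W2)
print(naive_bound)
```

Semidefinite-programming approaches exploit the slope restrictions of the activations to close much of the gap between this product bound and the true Lipschitz constant.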
no code implementations • 18 Mar 2024 • Patricia Pauli, Dennis Gramlich, Frank Allgöwer
For this reason, we explicitly provide a state space representation of the Roesser type for 2-D convolutional layers with $c_\mathrm{in}r_1 + c_\mathrm{out}r_2$ states, where $c_\mathrm{in}$/$c_\mathrm{out}$ is the number of input/output channels of the layer and $r_1$/$r_2$ characterizes the width/length of the convolution kernel.
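The state count of that realization is straightforward to evaluate; the helper below is illustrative only (the channel and kernel parameters are placeholders, with $r_1$/$r_2$ taken as the kernel-size parameters defined in the paper).

```python
def roesser_state_count(c_in, c_out, r1, r2):
    """State dimension c_in*r1 + c_out*r2 of the Roesser-type
    realization of a 2-D convolutional layer, with r1/r2 the
    kernel-size parameters as defined in the paper."""
    return c_in * r1 + c_out * r2

# Hypothetical layer: 3 input channels, 16 output channels, r1 = r2 = 2.
print(roesser_state_count(3, 16, 2, 2))
```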
no code implementations • 6 Mar 2023 • Dennis Gramlich, Patricia Pauli, Carsten W. Scherer, Frank Allgöwer, Christian Ebenbauer
This paper introduces a novel representation of convolutional neural networks (CNNs) in terms of 2-D dynamical systems.
no code implementations • 28 Nov 2022 • Patricia Pauli, Dennis Gramlich, Frank Allgöwer
In this work, we propose a dissipativity-based method for Lipschitz constant estimation of 1D convolutional neural networks (CNNs).
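As a point of reference for what such estimates target, the special case of a single linear (activation-free), single-channel circular 1-D convolution admits an exact l2 Lipschitz constant: the largest magnitude of the kernel's DFT. The snippet below illustrates this special case only; the signal length and kernel are assumed values, and the paper's dissipativity-based method addresses full CNNs with nonlinear activations.

```python
import numpy as np

n = 64                            # assumed signal length
k = np.array([1.0, -2.0, 1.0])    # assumed convolution kernel

# Zero-pad the kernel to the signal length; for circular convolution,
# the l2 operator norm equals the peak magnitude of the kernel's DFT.
k_padded = np.zeros(n)
k_padded[:k.size] = k
lip = np.max(np.abs(np.fft.fft(k_padded)))

# Cross-check: build the circulant matrix (columns are cyclic shifts
# of the padded kernel) and compare against its spectral norm.
C = np.stack([np.roll(k_padded, j) for j in range(n)], axis=1)
print(lip, np.linalg.norm(C, 2))
```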
1 code implementation • 3 Jan 2022 • Patricia Pauli, Niklas Funcke, Dennis Gramlich, Mohamed Amine Msalmi, Frank Allgöwer
This paper is concerned with the training of neural networks (NNs) under semidefinite constraints, which allows for NN training with robustness and stability guarantees.
1 code implementation • 31 Mar 2021 • Patricia Pauli, Dennis Gramlich, Julian Berberich, Frank Allgöwer
In this paper, we analyze the stability of feedback interconnections of a linear time-invariant system with a neural network nonlinearity in discrete time.