no code implementations • 3 Jun 2019 • Emilio Rafael Balda, Arash Behboodi, Niklas Koep, Rudolf Mathar
To study how robustness generalizes, recent works assume that the inputs have bounded $\ell_2$-norm in order to bound the adversarial risk for $\ell_\infty$ attacks with no explicit dimension dependence.
no code implementations • ICLR 2019 • Emilio Rafael Balda, Arash Behboodi, Rudolf Mathar
Studying the evolution of information theoretic quantities during Stochastic Gradient Descent (SGD) learning of Artificial Neural Networks (ANNs) has gained popularity in recent years.
1 code implementation • 25 Apr 2019 • Arya Bangun, Arash Behboodi, Rudolf Mathar
It is first shown that random sensing matrices, which consist of random samples of Wigner D-functions, satisfy the Restricted Isometry Property (RIP) with a proper preconditioning and can be used for sparse recovery on the rotation group.
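A toy illustration of the preconditioning idea, using a generic Gaussian sensing matrix as a stand-in (not actual Wigner D-function samples): column normalization plays the role of the preconditioner, and low mutual coherence is a simple, computable proxy for the RIP.

```python
import numpy as np

rng = np.random.default_rng(3)
m, n = 128, 256  # illustrative dimensions, not from the paper
A = rng.standard_normal((m, n))

# "Precondition" by normalizing every column to unit l2-norm.
A /= np.linalg.norm(A, axis=0)

# Mutual coherence: largest off-diagonal entry of the Gram matrix.
# Small coherence implies the RIP holds for sufficiently sparse signals.
G = np.abs(A.T @ A)
np.fill_diagonal(G, 0.0)
coherence = G.max()
```

For a normalized random Gaussian matrix of this size, the coherence concentrates well below 1, which is what makes sparse recovery guarantees possible.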
Information Theory
no code implementations • 29 Jan 2019 • Peter Langenberg, Emilio Rafael Balda, Arash Behboodi, Rudolf Mathar
In this work, this problem is studied through the lens of compression which is captured by the low-rank structure of weight matrices.
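A minimal sketch of compression via the low-rank structure of a weight matrix, in the spirit of the snippet above. The matrix and rank here are illustrative, not taken from the paper: a truncated SVD keeps the dominant singular directions and cuts the parameter count.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative stand-in for a trained layer's weights: a 64x64 matrix
# that is approximately rank-8, plus small full-rank noise.
W = rng.standard_normal((64, 8)) @ rng.standard_normal((8, 64))
W += 0.01 * rng.standard_normal((64, 64))

# Truncated SVD: keep the top-k singular directions.
U, s, Vt = np.linalg.svd(W, full_matrices=False)
k = 8
W_compressed = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# Relative approximation error, and parameters stored by each form.
rel_err = np.linalg.norm(W - W_compressed) / np.linalg.norm(W)
params_full = W.size                              # 64 * 64
params_lowrank = k * (W.shape[0] + W.shape[1])    # k * (64 + 64)
```

Storing the two factors instead of the full matrix is the basic compression mechanism that the low-rank viewpoint captures.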
no code implementations • 15 Dec 2018 • Emilio Rafael Balda, Arash Behboodi, Rudolf Mathar
The framework can be used to propose novel attacks against learning algorithms for classification and regression tasks under various new constraints, with closed-form solutions in many instances.
no code implementations • 21 Mar 2018 • Linchen Xiao, Arash Behboodi, Rudolf Mathar
As a data-driven approach, Fingerprinting Localization Solutions (FPSs) enjoy huge popularity due to their good performance and minimal requirements on environment information.
1 code implementation • 9 Mar 2018 • Emilio Rafael Balda, Arash Behboodi, Rudolf Mathar
Moreover, this framework is capable of explaining various existing adversarial methods and can be used to derive new algorithms as well.
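One of the existing adversarial methods such frameworks recover is the fast gradient sign method (FGSM). A minimal sketch on an illustrative linear classifier (the model, loss, and data below are assumptions for demonstration, not the paper's setup):

```python
import numpy as np

rng = np.random.default_rng(1)
w = rng.standard_normal(10)   # weights of an illustrative linear classifier
x = rng.standard_normal(10)   # a clean input
y = 1.0                       # its label in {-1, +1}

# For a margin loss on y * w.x, the input gradient of the loss is -y * w.
# FGSM perturbs the input by epsilon in the sign direction of that gradient,
# i.e. the worst-case step under an l_inf budget (a closed-form solution).
eps = 0.1
grad = -y * w
x_adv = x + eps * np.sign(grad)

# The perturbation stays within the l_inf budget and shrinks the margin.
margin_clean = y * (w @ x)
margin_adv = y * (w @ x_adv)
```

The sign step is exactly the closed-form maximizer of a linearized loss over the $\ell_\infty$ ball, which is why such attacks fall out of a unified first-order framework.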
no code implementations • ICLR 2018 • Emilio Rafael Balda, Arash Behboodi, Rudolf Mathar
We carry out a tensor analysis of the expressive-power interconnections of convolutional arithmetic circuits (ConvACs) and relate our results to standard convolutional networks.
no code implementations • 21 Mar 2016 • Martijn Arts, Marius Cordts, Monika Gorin, Marc Spehr, Rudolf Mathar
It is shown that the presented network converges to equilibrium points that are solutions to general non-negative least squares optimization problems.
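The fixed points described above can be checked against a direct non-negative least squares (NNLS) solver. A sketch with illustrative problem data (the matrix and ground truth below are assumptions, not from the paper), using SciPy's standard `nnls` routine:

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(2)
A = rng.standard_normal((20, 5))
x_true = np.array([0.0, 1.5, 0.0, 2.0, 0.5])  # non-negative ground truth
b = A @ x_true                                 # consistent measurements

# Solve  min ||A x - b||_2  subject to  x >= 0.
x_hat, residual = nnls(A, b)
```

Because `b` lies exactly in the range of `A` with a non-negative preimage, the NNLS solution recovers `x_true` with near-zero residual, matching the equilibrium the network would settle at.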