no code implementations • 23 May 2024 • Peter Súkeník, Marco Mondelli, Christoph Lampert
Deep neural networks (DNNs) exhibit a surprising structure in their final layer known as neural collapse (NC), and a growing body of work has recently investigated the propagation of neural collapse to earlier layers of DNNs -- a phenomenon called deep neural collapse (DNC).
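One standard way to quantify neural collapse in a given layer is the within-class variability metric (often called NC1): the trace of the within-class feature covariance relative to the between-class covariance, which goes to zero as features of each class collapse to their class mean. A minimal sketch (the function name and normalization are illustrative, not taken from the paper):

```python
import numpy as np

def nc1_metric(features, labels):
    """Within-class variability collapse (NC1, illustrative form):
    trace of the within-class covariance divided by the trace of the
    between-class covariance. Values near zero indicate that the
    layer's features have collapsed to their class means."""
    classes = np.unique(labels)
    n, d = features.shape
    global_mean = features.mean(axis=0)
    sw = np.zeros((d, d))  # within-class scatter
    sb = np.zeros((d, d))  # between-class scatter
    for c in classes:
        fc = features[labels == c]
        mu_c = fc.mean(axis=0)
        centered = fc - mu_c
        sw += centered.T @ centered / n
        sb += (len(fc) / n) * np.outer(mu_c - global_mean, mu_c - global_mean)
    return np.trace(sw) / np.trace(sb)
```

A fully collapsed layer (every feature equal to its class mean) yields a metric of exactly zero; noisy features yield a positive value.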
no code implementations • 21 Feb 2024 • Daniel Beaglehole, Peter Súkeník, Marco Mondelli, Mikhail Belkin
Deep Recursive Feature Machines (Deep RFM) construct a neural network by iteratively mapping the data with the Average Gradient Outer Product (AGOP) and applying an untrained random feature map.
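The loop described above can be sketched as follows. This is a deliberate simplification, not the paper's method: a ridge-regression predictor stands in for the kernel machine, so the per-sample gradient is constant and the AGOP reduces to `W @ W.T`; the depth, width, and regularization values are arbitrary:

```python
import numpy as np

def deep_rfm_sketch(X, y, depth=3, width=256, reg=1e-3, seed=0):
    """Hypothetical Deep-RFM-style loop with a linear predictor at each
    layer: (1) fit a ridge predictor, (2) form the AGOP of its gradients,
    (3) rescale the data by the matrix square root of the AGOP, and
    (4) push the result through an untrained random ReLU feature map."""
    rng = np.random.default_rng(seed)
    Z = X
    for _ in range(depth):
        # 1) linear predictor via ridge regression
        W = np.linalg.solve(Z.T @ Z + reg * np.eye(Z.shape[1]), Z.T @ y)
        # 2) AGOP: for f(z) = z @ W the per-sample gradient is W,
        #    so the average gradient outer product is W @ W.T
        M = W @ W.T
        # 3) matrix square root of the (PSD) AGOP via eigendecomposition
        vals, vecs = np.linalg.eigh(M)
        sqrt_M = vecs @ np.diag(np.sqrt(np.clip(vals, 0.0, None))) @ vecs.T
        # 4) untrained random ReLU feature map
        G = rng.normal(size=(Z.shape[1], width)) / np.sqrt(Z.shape[1])
        Z = np.maximum(Z @ sqrt_M @ G, 0.0)
    # final linear readout on the constructed features
    W = np.linalg.solve(Z.T @ Z + reg * np.eye(Z.shape[1]), Z.T @ y)
    return Z @ W
```

The AGOP rescaling concentrates the data along directions the current predictor is sensitive to, which is what lets the untrained random feature map produce useful features at the next layer.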
no code implementations • 11 Oct 2022 • Peter Kocsis, Peter Súkeník, Guillem Brasó, Matthias Nießner, Laura Leal-Taixé, Ismail Elezi
This allows us to improve the generalization of a CNN-based model without any increase in the number of weights at test time.
no code implementations • 29 Aug 2022 • Peter Súkeník, Christoph H. Lampert
Modern machine learning tasks often require considering not just one but multiple objectives.
no code implementations • 11 Oct 2021 • Peter Súkeník, Aleksei Kuvshinov, Stephan Günnemann
We show that, in general, input-dependent smoothing suffers from the curse of dimensionality, forcing the variance function to have low semi-elasticity.
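For context, input-dependent randomized smoothing replaces the single global noise level of standard randomized smoothing with a per-input standard deviation sigma(x); the smoothed classifier returns the class the base classifier predicts most often under that noise. A minimal Monte Carlo sketch (function names and the example sigma function are illustrative, not from the paper):

```python
import numpy as np

def smoothed_predict(f, x, sigma_fn, n_samples=1000, seed=0):
    """Input-dependent randomized smoothing, Monte Carlo sketch:
    the Gaussian noise scale is a function sigma_fn(x) of the input
    itself, rather than a single global constant. The curse-of-
    dimensionality result above constrains how fast sigma_fn may
    vary (its semi-elasticity) for certification to remain possible."""
    rng = np.random.default_rng(seed)
    sigma = sigma_fn(x)  # input-dependent noise scale
    noise = rng.normal(scale=sigma, size=(n_samples,) + x.shape)
    preds = np.array([f(x + eps) for eps in noise])
    return np.bincount(preds).argmax()  # majority vote over noisy inputs
```

Example usage with a toy base classifier: `smoothed_predict(lambda z: int(z.sum() > 0), np.array([1.0, 1.0]), lambda z: 0.1 + 0.05 * float(np.linalg.norm(z)))`.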