Search Results for author: Peter Súkeník

Found 5 papers, 0 papers with code

Neural Collapse versus Low-rank Bias: Is Deep Neural Collapse Really Optimal?

no code implementations23 May 2024 Peter Súkeník, Marco Mondelli, Christoph Lampert

Deep neural networks (DNNs) exhibit a surprising structure in their final layer known as neural collapse (NC), and a growing body of work has recently investigated the propagation of neural collapse to earlier layers of DNNs -- a phenomenon called deep neural collapse (DNC).

Binary Classification · Multi-class Classification
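Neural collapse is typically quantified by how far within-class feature variability has shrunk relative to between-class separation. As a hedged illustration (not the paper's own code), here is a minimal sketch of the standard NC1 metric, trace(Σ_W Σ_B^+)/K, where Σ_W and Σ_B are the within- and between-class covariance matrices of the last-layer features:

```python
import numpy as np

def nc1_metric(features, labels):
    """Within-class variability relative to between-class separation.

    features: (n, d) array of last-layer activations
    labels:   (n,) array of class indices
    Returns trace(S_W @ pinv(S_B)) / K, which approaches 0 under collapse.
    """
    classes = np.unique(labels)
    n, d = features.shape
    global_mean = features.mean(axis=0)
    S_W = np.zeros((d, d))  # within-class covariance
    S_B = np.zeros((d, d))  # between-class covariance
    for c in classes:
        Xc = features[labels == c]
        mu_c = Xc.mean(axis=0)
        centered = Xc - mu_c
        S_W += centered.T @ centered / n
        diff = (mu_c - global_mean)[:, None]
        S_B += (diff @ diff.T) * (len(Xc) / n)
    return np.trace(S_W @ np.linalg.pinv(S_B)) / len(classes)
```

When every sample of a class sits exactly at its class mean, S_W vanishes and the metric is 0, the fully collapsed regime.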

Average gradient outer product as a mechanism for deep neural collapse

no code implementations21 Feb 2024 Daniel Beaglehole, Peter Súkeník, Marco Mondelli, Mikhail Belkin

Deep Recursive Feature Machines (Deep RFM) is a method that constructs a neural network by iteratively mapping the data with the average gradient outer product (AGOP) and applying an untrained random feature map.
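The iterate-and-map loop can be sketched as follows. This is a simplified illustration, not the authors' implementation: the predictor at each layer is plain ridge regression (for a linear predictor f(x) = wᵀx every per-sample gradient is w, so the AGOP reduces to w wᵀ), and `deep_rfm`, `agop_linear`, and all hyperparameters are hypothetical names chosen for the sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

def agop_linear(X, y, reg=1e-3):
    # Ridge-regression predictor; its AGOP (average of per-sample
    # gradient outer products) is simply w @ w.T for a linear model.
    w = np.linalg.solve(X.T @ X + reg * np.eye(X.shape[1]), X.T @ y)
    return np.outer(w, w)

def deep_rfm(X, y, depth=3, width=64):
    for _ in range(depth):
        M = agop_linear(X, y)
        # symmetric square root of the PSD AGOP matrix
        vals, vecs = np.linalg.eigh(M)
        root = vecs @ np.diag(np.sqrt(np.clip(vals, 0.0, None))) @ vecs.T
        X = X @ root                       # reweight directions by the AGOP
        W = rng.normal(size=(X.shape[1], width)) / np.sqrt(X.shape[1])
        X = np.maximum(X @ W, 0.0)         # untrained random ReLU feature map
    return X
```

Each pass reweights the data by the square root of the AGOP and then lifts it through a fresh, untrained random ReLU layer, which is the mechanism the paper connects to deep neural collapse.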

Generalization In Multi-Objective Machine Learning

no code implementations29 Aug 2022 Peter Súkeník, Christoph H. Lampert

Modern machine learning tasks often require considering not just one but multiple objectives.

Fairness · Generalization Bounds · +1

Intriguing Properties of Input-dependent Randomized Smoothing

no code implementations11 Oct 2021 Peter Súkeník, Aleksei Kuvshinov, Stephan Günnemann

We show that in general, the input-dependent smoothing suffers from the curse of dimensionality, forcing the variance function to have low semi-elasticity.

Fairness
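In randomized smoothing, a base classifier is evaluated under Gaussian input noise and the majority vote is returned; in the input-dependent variant studied here, the noise scale σ(x) varies with the input. A minimal sketch, with a toy base classifier and an illustrative σ(x) of my own choosing (the slowly growing, low-semi-elasticity form is an assumption for illustration, not the paper's prescription):

```python
import numpy as np

rng = np.random.default_rng(0)

def base_classifier(x):
    # toy binary base classifier (assumption): sign of the first coordinate
    return int(x[0] > 0)

def sigma_fn(x):
    # illustrative input-dependent noise scale: grows only logarithmically
    # in ||x||, so its semi-elasticity (d log sigma / d ||x||) stays small
    return 0.5 * (1.0 + 0.1 * np.log1p(np.linalg.norm(x)))

def smoothed_predict(x, sigma_fn, n_samples=1000):
    # Monte Carlo estimate of the smoothed classifier: majority vote
    # of the base classifier under N(0, sigma(x)^2 I) input noise.
    sigma = sigma_fn(x)
    noise = rng.normal(scale=sigma, size=(n_samples, x.shape[0]))
    votes = np.array([base_classifier(x + eps) for eps in noise])
    return int(votes.mean() > 0.5)
```

The paper's point is a constraint on `sigma_fn`: in high dimensions, certification forces the variance function to change slowly with x (low semi-elasticity), limiting how adaptive the smoothing can be.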
