no code implementations • 27 Feb 2024 • Kyriakos Axiotis, Vincent Cohen-Addad, Monika Henzinger, Sammy Jerome, Vahab Mirrokni, David Saulpic, David Woodruff, Michael Wunder
We study the data selection problem, whose aim is to select a small representative subset of data that can be used to efficiently train a machine learning model.
no code implementations • 27 Feb 2024 • Taisuke Yasuda, Kyriakos Axiotis, Gang Fu, Mohammadhossein Bateni, Vahab Mirrokni
Neural network pruning is a key technique for engineering models that are large yet scalable, interpretable, and generalizable.
no code implementations • 10 Nov 2023 • Kyriakos Axiotis, Sami Abu-El-Haija, Lin Chen, Matthew Fahrbach, Gang Fu
We demonstrate the success of Greedy PIG on a wide variety of tasks, including image feature attribution, graph compression/explanation, and post-hoc feature selection on tabular data.
no code implementations • 14 Jul 2023 • Kyriakos Axiotis, Taisuke Yasuda
We give the first recovery guarantees for the Group LASSO for sparse convex optimization with vector-valued features.
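For context, here is a minimal sketch of one standard way to fit a Group LASSO model: proximal gradient descent with block soft-thresholding, where each group of coordinates plays the role of a vector-valued feature. The grouping, penalty weight, and step size below are illustrative assumptions, not the setting analyzed in the paper.

```python
import numpy as np

def group_lasso_prox_gd(X, y, groups, lam=0.1, iters=500):
    """Proximal gradient descent for 0.5*||Xw - y||^2 + lam * sum_g ||w_g||_2.

    groups: list of index arrays, one per vector-valued feature (group).
    """
    d = X.shape[1]
    step = 1.0 / np.linalg.norm(X, 2) ** 2  # 1/L for the smooth part
    w = np.zeros(d)
    for _ in range(iters):
        w = w - step * (X.T @ (X @ w - y))  # gradient step on the smooth loss
        for g in groups:  # block soft-thresholding: prox of the group norm
            norm = np.linalg.norm(w[g])
            w[g] = 0.0 if norm <= step * lam else (1.0 - step * lam / norm) * w[g]
    return w

# Illustrative call: three 2-dimensional vector-valued features (hypothetical data).
X, y = np.random.randn(50, 6), np.random.randn(50)
w = group_lasso_prox_gd(X, y, groups=[np.arange(0, 2), np.arange(2, 4), np.arange(4, 6)])
```

The block soft-thresholding step zeroes out entire groups at once, which is what makes the Group LASSO select features at the group level rather than coordinate by coordinate.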
no code implementations • 26 Jun 2023 • Kyriakos Axiotis, Maxim Sviridenko
We show that running gradient descent with variable learning rate guarantees loss $f(x) \leq 1.1 \cdot f(x^*) + \epsilon$ for the logistic regression objective, where the error $\epsilon$ decays exponentially with the number of iterations and polynomially with the magnitude of the entries of an arbitrary fixed solution $x^*$.
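As a rough illustration, the sketch below runs gradient descent on the averaged logistic loss with a caller-supplied variable learning-rate schedule. The growing schedule in the example is a placeholder assumption; the paper derives its own rate rule to obtain the $1.1 \cdot f(x^*) + \epsilon$ guarantee.

```python
import numpy as np
from scipy.special import expit  # numerically stable sigmoid

def gd_logistic(X, y, schedule, iters=200):
    """Gradient descent on f(w) = mean(log(1 + exp(-y * Xw))), y in {-1, +1},
    with a variable learning rate eta_t = schedule(t)."""
    n, d = X.shape
    w = np.zeros(d)
    for t in range(iters):
        margins = y * (X @ w)
        grad = -(X.T @ (y * expit(-margins))) / n  # gradient of the mean loss
        w = w - schedule(t) * grad
    return w

# Hypothetical data and a mildly growing placeholder schedule.
X, y = np.random.randn(100, 5), np.sign(np.random.randn(100))
w = gd_logistic(X, y, schedule=lambda t: 0.5 * (1.0 + 0.05 * t))
```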
no code implementations • 11 Apr 2022 • Kyriakos Axiotis, Maxim Sviridenko
We propose a simple modification to the iterative hard thresholding (IHT) algorithm, which recovers asymptotically sparser solutions as a function of the condition number.
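For reference, here is a minimal sketch of the unmodified IHT baseline on sparse least squares: a gradient step followed by keeping only the $k$ largest-magnitude coordinates. The sparsity level, step size, and data are illustrative; the paper's modification to this loop is what yields the improved sparsity as a function of the condition number.

```python
import numpy as np

def iht(X, y, k, iters=300):
    """Iterative hard thresholding for k-sparse least squares."""
    d = X.shape[1]
    step = 1.0 / np.linalg.norm(X, 2) ** 2  # 1/L for 0.5*||Xw - y||^2
    w = np.zeros(d)
    for _ in range(iters):
        w = w - step * (X.T @ (X @ w - y))          # gradient step
        keep = np.argpartition(np.abs(w), -k)[-k:]  # top-k support
        pruned = np.zeros(d)
        pruned[keep] = w[keep]                      # hard thresholding
        w = pruned
    return w

# Illustrative recovery of a 3-sparse vector (hypothetical data).
X = np.random.randn(80, 30)
w_true = np.zeros(30)
w_true[[2, 11, 25]] = [1.5, -2.0, 0.7]
w_hat = iht(X, X @ w_true, k=3)
```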
no code implementations • 5 Mar 2021 • Kyriakos Axiotis, Adam Karczmarz, Anish Mukherjee, Piotr Sankowski, Adrian Vladu
This paper bridges discrete and continuous optimization approaches for decomposable submodular function minimization, in both the standard and parametric settings.
no code implementations • ICLR 2021 • Kyriakos Axiotis, Maxim Sviridenko
We propose greedy and local search algorithms for rank-constrained convex optimization, namely solving $\underset{\mathrm{rank}(A)\leq r^*}{\min}\, R(A)$ given a convex function $R:\mathbb{R}^{m\times n}\rightarrow \mathbb{R}$ and a parameter $r^*$.
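To make the greedy flavor concrete, the sketch below grows $A$ one rank-one atom at a time, taking each atom from the top singular pair of the negative gradient $-\nabla R(A)$. The step-size rule and the quadratic test objective are illustrative assumptions; the paper's greedy and local search procedures and their guarantees are more refined.

```python
import numpy as np

def greedy_rank_constrained(grad_R, shape, r_star, step=1.0):
    """Greedy sketch for min R(A) subject to rank(A) <= r_star: add the best
    rank-one descent direction (top singular pair of -grad R) each round."""
    A = np.zeros(shape)
    for _ in range(r_star):
        G = -grad_R(A)
        U, s, Vt = np.linalg.svd(G)
        A = A + step * s[0] * np.outer(U[:, 0], Vt[0, :])
    return A

# Illustrative objective R(A) = 0.5 * ||A - M||_F^2, so grad R(A) = A - M;
# with step=1.0 this greedily reproduces the rank-r truncated SVD of M.
M = np.random.randn(8, 6)
A = greedy_rank_constrained(lambda A: A - M, M.shape, r_star=2)
```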
no code implementations • ICML 2020 • Kyriakos Axiotis, Maxim Sviridenko
We present a new Adaptively Regularized Hard Thresholding (ARHT) algorithm that makes significant progress on sparse convex optimization by bringing the bound down to $\gamma=O(\kappa)$, which has been shown to be tight for a general class of algorithms including LASSO, OMP, and IHT.
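ARHT's adaptive choice of the regularizer is the paper's contribution and is not reproduced here; as a rough stand-in, the sketch below shows a single hard-thresholding step on an $\ell_2$-regularized least-squares objective with a fixed placeholder $\lambda$.

```python
import numpy as np

def regularized_ht_step(w, X, y, k, lam, step):
    """One hard-thresholding step on 0.5*||Xw - y||^2 + lam*||w||^2.
    ARHT adapts the regularization across iterations; here lam is a
    fixed placeholder, so this is only the skeleton of the update."""
    grad = X.T @ (X @ w - y) + 2.0 * lam * w   # gradient of the regularized loss
    w = w - step * grad
    keep = np.argpartition(np.abs(w), -k)[-k:]  # keep the k largest magnitudes
    out = np.zeros_like(w)
    out[keep] = w[keep]
    return out
```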