no code implementations • 1 Dec 2023 • Ehsan Beikihassan, Amy K. Hoover, Ioannis Koutis, Ali Parviz, Niloofar Aghaieabiane
We consider a setting where a population of artificial learners is given, and the objective is to optimize aggregate measures of performance, under constraints on training resources.
1 code implementation • 6 Oct 2023 • Dominique Beaini, Shenyang Huang, Joao Alex Cunha, Zhiyi Li, Gabriela Moisescu-Pareja, Oleksandr Dymov, Samuel Maddrell-Mander, Callum McLean, Frederik Wenkel, Luis Müller, Jama Hussein Mohamud, Ali Parviz, Michael Craig, Michał Koziarski, Jiarui Lu, Zhaocheng Zhu, Cristian Gabellini, Kerstin Klaser, Josef Dean, Cas Wognum, Maciej Sypetkowski, Guillaume Rabusseau, Reihaneh Rabbany, Jian Tang, Christopher Morris, Ioannis Koutis, Mirco Ravanelli, Guy Wolf, Prudencio Tossou, Hadrien Mary, Therence Bois, Andrew Fitzgibbon, Błażej Banaszewski, Chad Martin, Dominic Masters
Recently, pre-trained foundation models have enabled significant advancements in multiple fields.
no code implementations • 18 Jul 2023 • Shibo Yao, Dantong Yu, Ioannis Koutis
In this paper, we identify the close connection between matrix spectrum learning and neural network training for dense and convolutional layers and argue that weight pruning is essentially a matrix sparsification process to preserve the spectrum.
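As a rough illustration of the pruning-as-sparsification view, the toy sketch below (my own illustrative example, not the paper's method) prunes a dense weight matrix by magnitude and checks how much the leading singular value, one summary of the matrix spectrum, shifts:

```python
import numpy as np

def prune_matrix(W, keep_fraction):
    """Magnitude pruning: zero out the smallest-magnitude entries,
    keeping the given fraction of weights (a sparsification step)."""
    flat = np.abs(W).ravel()
    k = max(1, int(keep_fraction * flat.size))
    threshold = np.partition(flat, -k)[-k]   # k-th largest magnitude
    return W * (np.abs(W) >= threshold)

rng = np.random.default_rng(0)
W = rng.normal(size=(50, 50))
W_sparse = prune_matrix(W, keep_fraction=0.5)

# Compare the leading singular value before and after pruning
s_full = np.linalg.svd(W, compute_uv=False)
s_sparse = np.linalg.svd(W_sparse, compute_uv=False)
rel_err = abs(s_full[0] - s_sparse[0]) / s_full[0]
```

The point of the sketch is only that half the entries can be dropped while the top of the spectrum moves by a modest relative amount.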
no code implementations • 7 May 2023 • Ismail Bustany, Andrew B. Kahng, Ioannis Koutis, Bodhisatta Pramanik, Zhiang Wang
State-of-the-art hypergraph partitioners follow the multilevel paradigm that constructs multiple levels of progressively coarser hypergraphs that are used to drive cut refinement on each level of the hierarchy.
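For intuition, one level of the multilevel scheme can be sketched on an ordinary graph (hypergraph coarsening is analogous but operates on hyperedges); this greedy heavy-edge-matching sketch is illustrative only and is not the partitioners' actual algorithm:

```python
import numpy as np

def coarsen(A):
    """One coarsening level via greedy heavy-edge matching:
    pair each node with its heaviest unmatched neighbour,
    then contract matched pairs into coarse nodes."""
    n = len(A)
    match = -np.ones(n, dtype=int)
    for i in range(n):
        if match[i] != -1:
            continue
        nbrs = [j for j in range(n) if j != i and A[i, j] > 0 and match[j] == -1]
        match[i] = max(nbrs, key=lambda j: A[i, j]) if nbrs else i
        match[match[i]] = i
    # Map each matched pair (or singleton) to a coarse node id
    coarse_id, cid = {}, []
    for i in range(n):
        pair = tuple(sorted((i, match[i])))
        coarse_id.setdefault(pair, len(coarse_id))
        cid.append(coarse_id[pair])
    # Accumulate crossing edge weights into the coarse adjacency matrix
    Ac = np.zeros((len(coarse_id), len(coarse_id)))
    for i in range(n):
        for j in range(i + 1, n):
            if A[i, j] > 0 and cid[i] != cid[j]:
                Ac[cid[i], cid[j]] += A[i, j]
                Ac[cid[j], cid[i]] += A[i, j]
    return Ac

# A 4-cycle 0-1-2-3-0 coarsens to two coarse nodes joined by weight 2
A = np.zeros((4, 4))
for i, j in [(0, 1), (1, 2), (2, 3), (3, 0)]:
    A[i, j] = A[j, i] = 1.0
Ac = coarsen(A)
```

Repeating this step yields the hierarchy of progressively coarser instances on which cut refinement is driven.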
no code implementations • 21 Sep 2022 • Niloofar Aghaieabiane, Ioannis Koutis
A widely used approach to extracting information from gene expression data involves constructing a gene co-expression network and then applying algorithms that discover network structure.
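A minimal sketch of the construction step, assuming the common correlation-thresholding recipe (the paper's specific pipeline may differ):

```python
import numpy as np

def coexpression_network(expr, threshold=0.8):
    """Build a gene co-expression network: genes are nodes, and an
    edge joins two genes whose expression profiles across samples
    are strongly correlated (|Pearson r| >= threshold)."""
    corr = np.corrcoef(expr)                  # genes x genes correlations
    adj = (np.abs(corr) >= threshold).astype(int)
    np.fill_diagonal(adj, 0)                  # no self-loops
    return adj

# Toy data: 4 genes x 6 samples; gene 1 is roughly 2x gene 0,
# so those two genes become connected in the network.
expr = np.array([
    [1.0, 2.0, 3.0, 4.0, 5.0, 6.0],
    [2.1, 4.0, 6.2, 8.1, 9.9, 12.0],
    [5.0, 1.0, 4.0, 2.0, 6.0, 3.0],
    [0.5, 3.5, 1.5, 4.5, 2.5, 5.5],
])
adj = coexpression_network(expr, threshold=0.9)
```

Structure-discovery algorithms (e.g., module detection) are then run on the resulting adjacency matrix.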
no code implementations • 29 Sep 2021 • Ali Parviz, Ioannis Koutis
Spectral network embeddings are based on the computation of eigenvectors of a normalized graph Laplacian.
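The basic computation can be sketched as follows (a dense-matrix toy version; practical implementations use sparse eigensolvers):

```python
import numpy as np

def spectral_embedding(A, dim):
    """Embed graph nodes using eigenvectors of the symmetric
    normalized Laplacian L = I - D^{-1/2} A D^{-1/2}."""
    d = A.sum(axis=1)
    d_inv_sqrt = 1.0 / np.sqrt(np.maximum(d, 1e-12))  # guard isolated nodes
    L = np.eye(len(A)) - d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]
    vals, vecs = np.linalg.eigh(L)        # eigenvalues in ascending order
    # Skip the trivial bottom eigenvector; keep the next `dim`
    return vecs[:, 1:dim + 1]

# Two triangles joined by one edge: the first embedding coordinate
# separates the two communities (up to a global sign).
A = np.zeros((6, 6))
for i, j in [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]:
    A[i, j] = A[j, i] = 1.0
emb = spectral_embedding(A, dim=1)
```

The embedding coordinates of nodes 0-2 share one sign and those of nodes 3-5 the other, which is what makes these eigenvectors useful for clustering and embedding tasks.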
1 code implementation • NeurIPS 2019 • Ioannis Koutis, Huong Le
Applying spectral clustering to $H$ then has the potential to produce improved cuts that, due to the cut similarity, also exist in $G$.
no code implementations • ICML 2018 • Daniele Calandriello, Alessandro Lazaric, Ioannis Koutis, Michal Valko
By constructing a spectrally-similar graph, we are able to bound the error induced by the sparsification for a variety of downstream tasks (e.g., SSL).
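As a self-contained sketch of what a spectrally-similar sparsifier looks like, the toy code below samples edges by effective resistance in the Spielman-Srivastava style (illustrative only, and not necessarily the construction used in this paper):

```python
import numpy as np

def spectral_sparsify(A, num_samples, rng):
    """Sample edges with probability proportional to w_e * R_e
    (weight times effective resistance) and reweight each sample,
    so the sparsifier's Laplacian matches the original's in
    expectation."""
    n = len(A)
    L = np.diag(A.sum(axis=1)) - A
    Lp = np.linalg.pinv(L)                       # Laplacian pseudoinverse
    edges = [(i, j) for i in range(n) for j in range(i + 1, n) if A[i, j] > 0]
    # Leverage score of edge (i,j): w_ij * (Lp_ii + Lp_jj - 2 Lp_ij)
    scores = np.array([A[i, j] * (Lp[i, i] + Lp[j, j] - 2 * Lp[i, j])
                       for i, j in edges])
    p = scores / scores.sum()
    H = np.zeros_like(A)
    for k in rng.choice(len(edges), size=num_samples, p=p):
        i, j = edges[k]
        H[i, j] += A[i, j] / (num_samples * p[k])
        H[j, i] += A[i, j] / (num_samples * p[k])
    return H

# Sparsify the complete graph K5 down to at most 8 sampled edges
rng = np.random.default_rng(0)
A = np.ones((5, 5)) - np.eye(5)
H = spectral_sparsify(A, num_samples=8, rng=rng)
```

Because sampled edges are reweighted by the inverse of their sampling probability, quadratic forms $x^\top L_H x$ approximate $x^\top L_G x$, which is the property that lets downstream errors be bounded.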
no code implementations • 21 Jan 2016 • Daniele Calandriello, Alessandro Lazaric, Michal Valko, Ioannis Koutis
While the harmonic function solution performs well in many semi-supervised learning (SSL) tasks, it is known to scale poorly with the number of samples.
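The harmonic function solution itself (Zhu et al.'s classic formulation) is short to state: labeled values are fixed and unlabeled values solve $L_{uu} f_u = -L_{ul} f_l$. A direct dense-solver sketch, which is exactly the part that scales poorly with the number of samples:

```python
import numpy as np

def harmonic_solution(A, labeled_idx, labels):
    """Harmonic function SSL: clamp labeled nodes and solve
    L_uu f_u = -L_ul f_l, so every unlabeled node takes the
    weighted average of its neighbours' values."""
    n = len(A)
    L = np.diag(A.sum(axis=1)) - A
    u = np.array([i for i in range(n) if i not in set(labeled_idx)])
    l = np.array(labeled_idx)
    f = np.zeros(n)
    f[l] = labels
    f[u] = np.linalg.solve(L[np.ix_(u, u)], -L[np.ix_(u, l)] @ f[l])
    return f

# Path graph 0-1-2-3-4 with the endpoints labeled 0 and 1:
# the harmonic solution interpolates linearly between them.
A = np.zeros((5, 5))
for i in range(4):
    A[i, i + 1] = A[i + 1, i] = 1.0
f = harmonic_solution(A, labeled_idx=[0, 4], labels=[0.0, 1.0])
# f ≈ [0, 0.25, 0.5, 0.75, 1]
```

The linear solve costs roughly cubic time in the number of unlabeled samples when done densely, which is the scaling bottleneck the paper addresses.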