no code implementations • 7 Apr 2024 • Mohamed El Amine Seddik, Suei-Wen Chen, Soufiane Hayou, Pierre Youssef, Merouane Debbah
With the aim of rigorously understanding model collapse in language models, we consider in this paper a statistical model that allows us to characterize the impact of various recursive training scenarios.
1 code implementation • 16 Feb 2024 • Hugo Lebeau, Mohamed El Amine Seddik, José Henrique de Morais Goulart
We study the estimation of a planted signal hidden in a recently introduced nested matrix-tensor model, which is an extension of the classical spiked rank-one tensor model, motivated by multi-view clustering.
1 code implementation • 10 Jan 2024 • Mayug Maniparambil, Raiymbek Akshulakov, Yasser Abdelaziz Dahou Djilali, Sanath Narayan, Mohamed El Amine Seddik, Karttikeya Mangalam, Noel E. O'Connor
In the absence of statistical similarity in aligned encoders like CLIP, we show that a possible matching of unaligned encoders exists without any training.
no code implementations • 28 Oct 2023 • Mohamed El Amine Seddik, Maxime Guillaud, Alexis Decurninge, José Henrique de Morais Goulart
This work introduces an asymptotic study of Hotelling-type tensor deflation in the presence of noise, in the regime of large tensor dimensions.
no code implementations • 31 May 2023 • Mohamed El Amine Seddik, Mastane Achab, Henrique Goulart, Merouane Debbah
To study the theoretical performance of this approach, we characterize, in the large-dimensional regime, the behavior of the best rank-one approximation in terms of the alignments of the obtained component vectors with the hidden model parameter vectors.
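The best rank-one approximation studied above can be computed in practice with higher-order power iteration, a standard method for this problem (this is a minimal NumPy sketch of that generic algorithm, not the paper's asymptotic analysis; the function name and iteration count are illustrative):

```python
import numpy as np

def rank_one_approx(T, n_iter=100, seed=0):
    """Best rank-one approximation of an order-3 tensor T via
    higher-order power iteration: alternately update each component
    vector by contracting T against the other two, then normalize."""
    rng = np.random.default_rng(seed)
    n1, n2, n3 = T.shape
    u = rng.standard_normal(n1); u /= np.linalg.norm(u)
    v = rng.standard_normal(n2); v /= np.linalg.norm(v)
    w = rng.standard_normal(n3); w /= np.linalg.norm(w)
    for _ in range(n_iter):
        u = np.einsum('ijk,j,k->i', T, v, w); u /= np.linalg.norm(u)
        v = np.einsum('ijk,i,k->j', T, u, w); v /= np.linalg.norm(v)
        w = np.einsum('ijk,i,j->k', T, u, v); w /= np.linalg.norm(w)
    # Singular value of the rank-one approximation lam * (u ⊗ v ⊗ w).
    lam = np.einsum('ijk,i,j,k->', T, u, v, w)
    return lam, u, v, w
```

On a tensor with a strong planted rank-one spike, the recovered vectors align closely with the planted ones, which is exactly the kind of alignment the paper characterizes asymptotically.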
no code implementations • 20 Apr 2023 • Mohamed El Amine Seddik, José Henrique de Morais Goulart, Maxime Guillaud
This paper studies the deflation algorithm when applied to estimate a low-rank symmetric spike contained in a large tensor corrupted by additive Gaussian noise.
no code implementations • 11 Feb 2023 • Mohamed El Amine Seddik, Mohammed Mahfoud, Merouane Debbah
Relying on recently developed random tensor tools, this paper deals precisely with the non-orthogonal case by deriving an asymptotic analysis of a parameterized deflation procedure performed on an order-three and rank-two spiked tensor.
no code implementations • 16 Nov 2022 • Mohamed El Amine Seddik, Maxime Guillaud, Alexis Decurninge
Leveraging recent advances in random tensor theory, we consider in this paper a rank-$r$ asymmetric spiked tensor model of the form $\sum_{i=1}^r \beta_i A_i + W$, where $\beta_i\geq 0$ and the $A_i$'s are rank-one tensors such that $\langle A_i, A_j \rangle\in [0, 1]$ for $i\neq j$. Based on this model, we provide an asymptotic study of Hotelling-type tensor deflation in the large-dimensional regime.
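Sampling from such a rank-$r$ spiked model is straightforward; a minimal NumPy sketch follows (the $1/\sqrt{n}$ noise scaling is one common normalization in this literature and is an assumption here, as the paper's exact convention is not reproduced in the snippet above):

```python
import numpy as np

def spiked_tensor(n, betas, seed=0):
    """Sample an order-3 spiked tensor sum_i beta_i A_i + W, where each
    A_i = x_i (x) y_i (x) z_i is built from unit-norm random vectors and
    W has i.i.d. Gaussian entries scaled by 1/sqrt(n)."""
    rng = np.random.default_rng(seed)
    T = np.zeros((n, n, n))
    components = []
    for beta in betas:
        x, y, z = (rng.standard_normal(n) for _ in range(3))
        x, y, z = (v / np.linalg.norm(v) for v in (x, y, z))
        T += beta * np.einsum('i,j,k->ijk', x, y, z)  # rank-one spike
        components.append((x, y, z))
    W = rng.standard_normal((n, n, n)) / np.sqrt(n)  # additive noise
    return T + W, components
```

Note that independent Gaussian vectors in high dimension have small but nonzero inner products, so the resulting spikes are generically non-orthogonal, matching the $\langle A_i, A_j \rangle\in [0,1]$ setting.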
no code implementations • 23 Dec 2021 • Mohamed El Amine Seddik, Maxime Guillaud, Romain Couillet
Relying on random matrix theory (RMT), this paper studies asymmetric order-$d$ spiked tensor models with Gaussian noise.
1 code implementation • 4 Sep 2021 • Mohamed El Amine Seddik, Changmin Wu, Johannes F. Lutzeyer, Michalis Vazirgiannis
The robustness of the widely used Graph Convolutional Networks (GCNs) to perturbations of their input is becoming a topic of increasing importance.
1 code implementation • 18 Feb 2021 • Abdallah Benzine, Mohamed El Amine Seddik, Julien Desmarais
These networks, although effective in multiple tasks such as classification or object detection, tend to focus on the most discriminative part of an object rather than retrieving all its relevant features.
Ranked #4 on Person Re-Identification on CUHK03 (detected)
no code implementations • ICML 2020 • Mohamed El Amine Seddik, Cosme Louart, Mohamed Tamaazousti, Romain Couillet
This paper shows that deep learning (DL) representations of data produced by generative adversarial nets (GANs) are random vectors which fall within the class of so-called \textit{concentrated} random vectors.
no code implementations • ICLR 2019 • Mohamed El Amine Seddik, Mohamed Tamaazousti, Romain Couillet
In this paper, we present a random matrix approach to recovering sparse principal components from $n$ observations of dimension $p$.
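For context, a simple baseline for this task is the truncated power method, which runs power iteration on the sample covariance while keeping only the $k$ largest-magnitude entries at each step (this is a standard sparse-PCA heuristic sketched here for illustration, not the paper's random matrix estimator; `sparse_pc` and its parameters are hypothetical names):

```python
import numpy as np

def sparse_pc(X, k, n_iter=50, seed=0):
    """Leading k-sparse principal component of data X (n samples, p dims)
    via truncated power iteration on the sample covariance."""
    n, p = X.shape
    Xc = X - X.mean(axis=0)            # center the data
    S = Xc.T @ Xc / n                  # sample covariance (p x p)
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(p); v /= np.linalg.norm(v)
    for _ in range(n_iter):
        v = S @ v
        idx = np.argsort(np.abs(v))[:-k]  # indices of all but the top-k
        v[idx] = 0.0                      # enforce k-sparsity
        v /= np.linalg.norm(v)
    return v
```

On data with a planted sparse spike, the estimate concentrates on the true support once the spike is strong enough, which is the regime the random matrix analysis makes precise.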
no code implementations • 4 Apr 2019 • John Lin, Mohamed El Amine Seddik, Mohamed Tamaazousti, Youssef Tamaazousti, Adrien Bartoli
We propose a novel learning approach, in the form of a fully-convolutional neural network (CNN), which automatically and consistently removes specular highlights from a single image by generating its diffuse component.
1 code implementation • 27 Feb 2019 • Mohamed El Amine Seddik, Mohamed Tamaazousti, John Lin
In this paper, we present a general framework named \textit{Generative Collaborative Networks} (GCN), where the idea consists in optimizing the \textit{generator} (the mapping of interest) in the feature space of a \textit{features extractor} network.
1 code implementation • 27 Dec 2017 • Youssef Tamaazousti, Hervé Le Borgne, Céline Hudelot, Mohamed El Amine Seddik, Mohamed Tamaazousti
We also propose a unified framework for these methods, based on diversifying the training problem.