no code implementations • 21 Oct 2023 • Pierre Colombo, Victor Pellegrain, Malik Boudiaf, Victor Storchan, Myriam Tami, Ismail Ben Ayed, Celine Hudelot, Pablo Piantanida
First, we introduce a scenario where the embedding of a pre-trained model is served through a gated API with compute-cost and data-privacy constraints.
no code implementations • 16 Oct 2023 • Mouad El Bouchattaoui, Myriam Tami, Benoit Lepetit, Paul-Henry Cournède
Under unconfoundedness, we target the Individual Treatment Effect (ITE) estimation with unobserved heterogeneity in the treatment response due to missing risk factors.
1 code implementation • CVPR 2023 • Malik Boudiaf, Etienne Bennequin, Myriam Tami, Antoine Toubhans, Pablo Piantanida, Céline Hudelot, Ismail Ben Ayed
We tackle the Few-Shot Open-Set Recognition (FSOSR) problem, i.e., classifying instances among a set of classes for which we only have a few labeled samples, while simultaneously detecting instances that do not belong to any known class.
1 code implementation • 18 Jun 2022 • Malik Boudiaf, Etienne Bennequin, Myriam Tami, Celine Hudelot, Antoine Toubhans, Pablo Piantanida, Ismail Ben Ayed
Through extensive experiments spanning 5 datasets, we show that OSTIM surpasses both inductive and existing transductive methods in detecting open-set instances while competing with the strongest transductive methods in classifying closed-set instances.
1 code implementation • 10 May 2022 • Etienne Bennequin, Myriam Tami, Antoine Toubhans, Celine Hudelot
Every day, a new method is published to tackle Few-Shot Image Classification, reporting ever-better performance on academic benchmarks.
no code implementations • 15 Oct 2021 • Victor Pellegrain, Myriam Tami, Michel Batteux, Céline Hudelot
The increasing complexity of Industry 4.0 systems brings new challenges regarding predictive maintenance tasks such as fault detection and diagnosis.
1 code implementation • 25 May 2021 • Etienne Bennequin, Victor Bouvier, Myriam Tami, Antoine Toubhans, Céline Hudelot
To classify query instances from novel classes encountered at test-time, they only require a support set composed of a few labelled samples.
1 code implementation • 26 Dec 2020 • Yassine Ouali, Céline Hudelot, Myriam Tami
In this paper, we explore contrastive learning for few-shot classification, in which we propose to use it as an additional auxiliary training objective acting as a data-dependent regularizer to promote more general and transferable features.
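As a rough illustration of the kind of auxiliary contrastive objective described above, here is a minimal NumPy sketch of an InfoNCE-style loss between two augmented views of a batch; the simplified formulation, function name, and temperature value are illustrative assumptions, not the paper's exact loss.

```python
import numpy as np

def info_nce_loss(z1, z2, temperature=0.5):
    """Contrastive (InfoNCE-style) loss between two batches of view embeddings.

    z1, z2: (N, D) L2-normalised embeddings of two augmented views of the
    same N images. Pair (z1[i], z2[i]) is positive; all other rows act as
    negatives. Returned value would be added to the cross-entropy term as a
    data-dependent regularizer.
    """
    z = np.concatenate([z1, z2], axis=0)            # (2N, D)
    sim = z @ z.T / temperature                     # scaled cosine similarities
    np.fill_diagonal(sim, -np.inf)                  # exclude self-comparisons
    n = z1.shape[0]
    pos_idx = np.concatenate([np.arange(n, 2 * n), np.arange(n)])
    # log-softmax over each row, then pick out the positive pair's log-prob
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -log_prob[np.arange(2 * n), pos_idx].mean()
```

In training this term would typically be weighted and summed with the supervised classification loss; aligned view pairs yield a lower loss than mismatched ones.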
no code implementations • 3 Dec 2020 • Victor Bouvier, Philippe Very, Clément Chastagnol, Myriam Tami, Céline Hudelot
First, we select for annotation target samples that are likely to improve the representations' transferability by measuring the variation, before and after annotation, of the transferability loss gradient.
no code implementations • NeurIPS 2020 • Sami Alkhoury, Emilie Devijver, Marianne Clausel, Myriam Tami, Eric Gaussier, Georges Oppenheim
We propose here a generalization of regression trees, referred to as Probabilistic Regression (PR) trees, that adapt to the smoothness of the prediction function relating input and output variables while preserving the interpretability of the prediction and being robust to noise.
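To give intuition for how a tree can adapt to the smoothness of the target function, here is a hypothetical one-split sketch in which the hard threshold of a classical regression tree is replaced by a smooth probabilistic gate; the function name, the sigmoid gate, and the bandwidth parameter are illustrative assumptions, not the paper's actual PR-tree construction.

```python
import numpy as np

def soft_split_predict(x, threshold, sigma, left_value, right_value):
    """One soft split: a sample falls in the right leaf with a probability
    that varies smoothly with its distance to the threshold (bandwidth
    sigma), and the prediction is the probability-weighted leaf average.
    As sigma -> 0 this recovers the usual hard regression-tree split."""
    p_right = 1.0 / (1.0 + np.exp(-(x - threshold) / sigma))
    return (1.0 - p_right) * left_value + p_right * right_value
```

Points far from the threshold behave as in an ordinary tree, while points near it blend both leaf values, which is what makes the prediction smooth and more robust to noise around split points.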
1 code implementation • ECCV 2020 • Yassine Ouali, Céline Hudelot, Myriam Tami
In this work, we propose a new unsupervised image segmentation approach based on mutual information maximization between different constructed views of the inputs.
Ranked #5 on Unsupervised Semantic Segmentation on COCO-Stuff-3
no code implementations • 25 Jun 2020 • Yassine Ouali, Victor Bouvier, Myriam Tami, Céline Hudelot
Learning Invariant Representations has been successfully applied for reconciling a source and a target domain for Unsupervised Domain Adaptation.
no code implementations • 24 Jun 2020 • Victor Bouvier, Philippe Very, Clément Chastagnol, Myriam Tami, Céline Hudelot
The emergence of Domain Invariant Representations (IR) has drastically improved the transferability of representations from a labelled source domain to a new, unlabelled target domain.
1 code implementation • 9 Jun 2020 • Yassine Ouali, Céline Hudelot, Myriam Tami
Deep neural networks have demonstrated their ability to deliver remarkable performance on a wide range of supervised learning tasks (e.g., image classification) when trained on extensive collections of labeled data (e.g., ImageNet).
5 code implementations • CVPR 2020 • Yassine Ouali, Céline Hudelot, Myriam Tami
To leverage the unlabeled examples, we enforce a consistency between the main decoder predictions and those of the auxiliary decoders, taking as inputs different perturbed versions of the encoder's output, and consequently, improving the encoder's representations.
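The unlabeled-data term described above can be sketched as follows; this toy NumPy version, including the function name, the mean-squared-error distance, and the lambda-style decoders, is an illustrative assumption about the shape of the objective, not the paper's implementation.

```python
import numpy as np

def consistency_loss(encoder_out, main_decoder, aux_decoders, perturbations):
    """Consistency term for unlabeled inputs: each auxiliary decoder sees a
    perturbed version of the shared encoder output and is trained to match
    the main decoder's prediction (which would be treated as a fixed target,
    i.e. no gradient through it, in actual training)."""
    target = main_decoder(encoder_out)
    losses = []
    for decoder, perturb in zip(aux_decoders, perturbations):
        pred = decoder(perturb(encoder_out))
        losses.append(np.mean((pred - target) ** 2))
    return float(np.mean(losses))
```

Driving this loss to zero forces predictions to be stable under perturbations of the encoder's output, which is the mechanism claimed to improve the encoder's representations.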
no code implementations • 25 Sep 2019 • Victor Bouvier, Céline Hudelot, Clément Chastagnol, Philippe Very, Myriam Tami
Second, we show that learning weighted representations plays a key role in relaxing the constraint of invariance and then preserving the risk of compression.
no code implementations • 27 Oct 2018 • Myriam Tami, Marianne Clausel, Emilie Devijver, Adrien Dulac, Eric Gaussier, Stefan Janaqi, Meriam Chebre
Tree-based ensemble methods, such as Random Forests and Gradient Boosted Trees, have been successfully used for regression in many applications and research studies.