no code implementations • 3 Dec 2020 • Victor Bouvier, Philippe Very, Clément Chastagnol, Myriam Tami, Céline Hudelot
First, we select for annotation the target samples most likely to improve the representations' transferability, as measured by the variation, before and after annotation, of the transferability loss gradient.
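A minimal sketch of this selection criterion, under assumptions not stated in the snippet: the transferability loss is replaced by a plain squared loss, pseudo-labels stand in for the unknown annotations, and all function names (`loss_grad`, `select_for_annotation`) are hypothetical illustrations rather than the authors' implementation.

```python
import numpy as np

def loss_grad(w, X, y):
    # Gradient of the squared loss 0.5 * ||X @ w - y||^2 with respect to w.
    # This is a stand-in for the transferability loss, which is task-specific.
    return X.T @ (X @ w - y)

def select_for_annotation(w, X_lab, y_lab, X_cand, y_guess):
    """Score each candidate target sample by how much annotating it would
    change the loss gradient, and return the index of the top-scoring one.
    y_guess holds pseudo-labels standing in for the not-yet-known annotations."""
    g_before = loss_grad(w, X_lab, y_lab)
    scores = []
    for x, y in zip(X_cand, y_guess):
        # Gradient after adding the candidate with its (pseudo-)label.
        X_aug = np.vstack([X_lab, x])
        y_aug = np.append(y_lab, y)
        g_after = loss_grad(w, X_aug, y_aug)
        scores.append(np.linalg.norm(g_after - g_before))
    return int(np.argmax(scores))
```

The candidate whose annotation would move the gradient the most is picked first, mirroring the "variation of the transferability loss gradient" idea in spirit only.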
no code implementations • 24 Jun 2020 • Victor Bouvier, Philippe Very, Clément Chastagnol, Myriam Tami, Céline Hudelot
The emergence of Domain Invariant Representations (IR) has drastically improved the transferability of representations from a labelled source domain to a new and unlabelled target domain.
no code implementations • 25 Sep 2019 • Victor Bouvier, Céline Hudelot, Clément Chastagnol, Philippe Very, Myriam Tami
Second, we show that learning weighted representations plays a key role in relaxing the constraint of invariance while mitigating the risk of compression.
no code implementations • 29 Jul 2019 • Victor Bouvier, Philippe Very, Céline Hudelot, Clément Chastagnol
Learning representations which remain invariant to a nuisance factor is of great interest in Domain Adaptation, Transfer Learning, and Fair Machine Learning.
no code implementations • 29 Jul 2019 • Victor Bouvier, Philippe Very, Céline Hudelot, Clément Chastagnol
Such an approach consists of learning a representation of the data such that the label distribution conditioned on this representation is domain-invariant.
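As a rough illustration of what "conditional invariance" asks for, the toy check below compares per-class representation means across the two domains. This is only a coarse proxy under strong assumptions (the actual criterion concerns the full conditional distribution of labels given the representation), and the function name `conditional_gap` is hypothetical.

```python
import numpy as np

def conditional_gap(z_src, y_src, z_tgt, y_tgt):
    """Toy proxy for conditional domain invariance: for each class present
    in both domains, compare the mean representation of source and target
    samples of that class. A gap of zero suggests (but does not prove) that
    the class-conditional representation distributions are aligned."""
    classes = np.intersect1d(np.unique(y_src), np.unique(y_tgt))
    gaps = [np.linalg.norm(z_src[y_src == c].mean(axis=0)
                           - z_tgt[y_tgt == c].mean(axis=0))
            for c in classes]
    return float(np.mean(gaps))
```

In a training loop, such a quantity could serve as a penalty encouraging the feature extractor toward class-conditionally aligned representations; matching first moments only is a deliberate simplification here.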