1 code implementation • 12 Oct 2023 • Olivier Laurent, Emanuel Aldea, Gianni Franchi
The distribution of the weights of modern deep neural networks (DNNs) - crucial for uncertainty quantification and robustness - is an eminently complex object due to its extremely high dimensionality.
1 code implementation • 17 Aug 2023 • Xuanlong Yu, Gianni Franchi, Jindong Gu, Emanuel Aldea
In this work, we propose a generalized AuxUE scheme for more robust uncertainty quantification on regression tasks.
1 code implementation • 20 Jul 2022 • Gianni Franchi, Xuanlong Yu, Andrei Bursuc, Emanuel Aldea, Severine Dubuisson, David Filliat
Predictive uncertainty estimation is essential for deploying Deep Neural Networks in real-world autonomous systems.
3 code implementations • 2 Mar 2022 • Gianni Franchi, Xuanlong Yu, Andrei Bursuc, Angel Tena, Rémi Kazmierczak, Séverine Dubuisson, Emanuel Aldea, David Filliat
However, disentangling the different types and sources of uncertainty is non-trivial for most datasets, especially since there is no ground truth for uncertainty.
no code implementations • 24 Feb 2022 • Xuanlong Yu, Gianni Franchi, Emanuel Aldea
To this end, this paper introduces a taxonomy and summary of CAR approaches, a new uncertainty estimation solution for CAR, and a set of experiments on depth accuracy and uncertainty quantification for CAR-based models on the KITTI dataset.
1 code implementation • 21 Oct 2021 • Xuanlong Yu, Gianni Franchi, Emanuel Aldea
It has become critical for deep learning algorithms to quantify their output uncertainties to satisfy reliability constraints and provide accurate results.
no code implementations • 13 Jul 2021 • Mai Lan Ha, Gianni Franchi, Emanuel Aldea, Volker Blanz
NDA transforms deep features to become more discriminative and therefore improves performance on various tasks.
2 code implementations • 4 Dec 2020 • Gianni Franchi, Andrei Bursuc, Emanuel Aldea, Severine Dubuisson, Isabelle Bloch
Bayesian neural networks (BNNs) have long been considered an ideal, yet unscalable, solution for improving the robustness and the predictive uncertainty of deep neural networks.
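The general idea behind scalable alternatives to full BNNs is to obtain several plausible predictions per input (from ensemble members, stochastic layers, or weight samples) and read uncertainty off their disagreement. The sketch below is a generic illustration of entropy-based predictive uncertainty from multiple stochastic predictors, not the specific method of the paper above; all function names are mine.

```python
import math

def softmax(logits):
    """Numerically stable softmax over a list of logits."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]

def ensemble_predictive_entropy(member_logits):
    """Average the softmax outputs of several stochastic predictors,
    then score uncertainty as the entropy of the mean prediction."""
    probs = [softmax(l) for l in member_logits]
    n = len(probs)
    mean = [sum(p[k] for p in probs) / n for k in range(len(probs[0]))]
    entropy = -sum(p * math.log(p) for p in mean if p > 0)
    return mean, entropy

# Members that agree yield low entropy; members that disagree yield high entropy.
agree = [[5.0, 0.0, 0.0], [4.0, 0.0, 0.0]]
disagree = [[5.0, 0.0, 0.0], [0.0, 5.0, 0.0]]
_, h_agree = ensemble_predictive_entropy(agree)
_, h_disagree = ensemble_predictive_entropy(disagree)
```

Averaging probabilities before taking the entropy (rather than averaging entropies) is what lets member disagreement show up as extra uncertainty.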
Ranked #132 on Image Classification on CIFAR-10
no code implementations • 1 Jun 2020 • Gianni Franchi, Andrei Bursuc, Emanuel Aldea, Severine Dubuisson, Isabelle Bloch
This is because modern DNNs are usually uncalibrated, and we cannot characterize their epistemic uncertainty.
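"Uncalibrated" here means the model's confidence does not match its empirical accuracy. A standard way to measure this is the Expected Calibration Error (ECE): bin predictions by confidence and compare the mean confidence to the accuracy inside each bin. The sketch below is a minimal illustration of that standard metric, not code from the paper above.

```python
def expected_calibration_error(confidences, correct, n_bins=10):
    """Bin predictions by confidence and average the gap between
    mean confidence and empirical accuracy, weighted by bin size."""
    bins = [[] for _ in range(n_bins)]
    for conf, ok in zip(confidences, correct):
        idx = min(int(conf * n_bins), n_bins - 1)
        bins[idx].append((conf, ok))
    n = len(confidences)
    ece = 0.0
    for b in bins:
        if not b:
            continue
        avg_conf = sum(c for c, _ in b) / len(b)
        acc = sum(1 for _, ok in b if ok) / len(b)
        ece += (len(b) / n) * abs(avg_conf - acc)
    return ece

# A model that is 90% confident but only 50% correct is badly miscalibrated.
overconfident = expected_calibration_error([0.9] * 10, [True] * 5 + [False] * 5)
well_calibrated = expected_calibration_error([0.8] * 10, [True] * 8 + [False] * 2)
```

An ECE of zero means confidence tracks accuracy in every bin; overconfident networks typically show large positive gaps in the high-confidence bins.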
no code implementations • ECCV 2020 • Gianni Franchi, Andrei Bursuc, Emanuel Aldea, Severine Dubuisson, Isabelle Bloch
During training, the weights of a Deep Neural Network (DNN) are optimized from a random initialization toward a near-optimal value that minimizes a loss function.
no code implementations • 7 Feb 2019 • Jennifer Vandoni, Emanuel Aldea, Sylvie Le Hégarat-Mascle
In this work, we use Belief Function Theory, which extends the probabilistic framework, to provide uncertainty bounds for different categories of crowd density estimators.
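In Belief Function Theory, each source assigns mass to *sets* of hypotheses rather than single hypotheses, and evidence from two sources is fused with Dempster's rule of combination, which renormalizes after discarding the conflicting mass. The sketch below illustrates that standard rule on a toy two-hypothesis frame; the frame and the mass values are invented for illustration and are not taken from the paper above.

```python
def dempster_combine(m1, m2):
    """Combine two mass functions (dicts keyed by frozenset focal
    elements) with Dempster's rule, normalizing out conflict."""
    combined = {}
    conflict = 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb
    if conflict >= 1.0:
        raise ValueError("totally conflicting sources cannot be combined")
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# Toy frame of discernment {low, high} for crowd density.
A = frozenset({"low"})
B = frozenset({"high"})
AB = frozenset({"low", "high"})  # mass on AB encodes ignorance
m1 = {A: 0.6, AB: 0.4}
m2 = {A: 0.5, B: 0.3, AB: 0.2}
m12 = dempster_combine(m1, m2)
```

Note how mass placed on the whole frame (`AB`) models a source's ignorance explicitly, which is what allows this framework to express uncertainty bounds rather than a single probability.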
no code implementations • 2 Aug 2018 • Nicola Pellicanò, Emanuel Aldea, Sylvie Le Hégarat-Mascle
This paper addresses the problem of head detection in crowded environments.
no code implementations • 10 Jul 2018 • Sylvie Le Hégarat-Mascle, Emanuel Aldea, Jennifer Vandoni
In this work, we introduce a strategy which relies on the use of a cumulative space of reduced dimensionality, derived from the coupling of a classic (Hough) cumulative space with an integral histogram trick.
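The "integral histogram trick" mentioned above builds on the classic integral-image idea: precompute 2D prefix sums once, after which the total vote mass in any axis-aligned rectangle of the cumulative space costs only four lookups. The sketch below shows that underlying prefix-sum mechanism on a toy vote grid; it is an illustration of the general trick, not the paper's coupled Hough-space construction.

```python
def integral_image(grid):
    """2D prefix sums with a zero border: integ[y][x] holds the
    sum of grid[:y][:x]."""
    h, w = len(grid), len(grid[0])
    integ = [[0] * (w + 1) for _ in range(h + 1)]
    for y in range(h):
        for x in range(w):
            integ[y + 1][x + 1] = (grid[y][x] + integ[y][x + 1]
                                   + integ[y + 1][x] - integ[y][x])
    return integ

def region_sum(integ, y0, x0, y1, x1):
    """Sum of grid[y0:y1][x0:x1] in O(1) via four lookups."""
    return (integ[y1][x1] - integ[y0][x1]
            - integ[y1][x0] + integ[y0][x0])

# Toy accumulator of Hough-style votes.
votes = [[1, 2, 3],
         [4, 5, 6],
         [7, 8, 9]]
integ = integral_image(votes)
total = region_sum(integ, 0, 0, 3, 3)  # whole grid
block = region_sum(integ, 1, 1, 3, 3)  # bottom-right 2x2 sub-region
```

The payoff is that scanning many candidate windows over the accumulator becomes independent of window size, which is what makes the reduced-dimensionality cumulative space cheap to query.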
no code implementations • 23 Mar 2018 • Nicola Pellicanò, Sylvie Le Hégarat-Mascle, Emanuel Aldea
This paper introduces an innovative approach for handling 2D compound hypotheses within the Belief Function Theory framework.
no code implementations • 1 Aug 2013 • Emanuel Aldea, Khurom H. Kiyani
In this paper, we address the problem of multiple-camera calibration in the presence of a homogeneous scene, without the possibility of employing calibration-object-based methods.