Learning with coarse labels
4 papers with code • 4 benchmarks • 4 datasets
Learning fine-grained representations from coarsely-labelled datasets can significantly reduce labelling cost. As a simple example, for the task of differentiating between pets, we need a knowledgeable cat lover to distinguish between 'British Shorthair' and 'Siamese', but even a child annotator can help discriminate between 'cat' and 'non-cat'.
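The cost saving comes from the label hierarchy itself: each fine-grained class maps to exactly one coarse class, so cheap annotators only ever see the coarse side. A minimal sketch of that setup, using a hypothetical breed-to-coarse mapping (the names and the `coarsen` helper are illustrative, not from any of the papers below):

```python
# Hypothetical fine-to-coarse mapping: at annotation time only the coarse
# label is collected; the fine-grained breed stays unknown to the model.
FINE_TO_COARSE = {
    "british_shorthair": "cat",
    "siamese": "cat",
    "beagle": "non-cat",
    "goldfish": "non-cat",
}

def coarsen(fine_labels):
    """Collapse fine-grained labels into the cheap coarse labels."""
    return [FINE_TO_COARSE[f] for f in fine_labels]

print(coarsen(["siamese", "beagle"]))  # ['cat', 'non-cat']
```

The task is then to recover representations that separate `british_shorthair` from `siamese` even though training only ever observes `cat`.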
Most implemented papers
Weakly Supervised Representation Learning with Coarse Labels
To mitigate this challenge, we propose an algorithm to learn the fine-grained patterns for the target task, when only its coarse-class labels are available.
Fine-grained Angular Contrastive Learning with Coarse Labels
A very practical example of C2FS is when the target classes are sub-classes of the training classes.
MaskCon: Masked Contrastive Learning for Coarse-Labelled Dataset
More specifically, within the contrastive learning framework, our method generates a soft label for each sample against the other samples, using the coarse labels together with another augmented view of the sample in question.
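The idea described above can be sketched as follows: similarities from a second augmented view of the query to the other samples are softmax-normalised, but only samples sharing the query's coarse label may receive soft mass; everything else is masked to zero. This is an illustrative NumPy sketch, not the authors' implementation, and the blending weight `w`, the temperature, and the `maskcon_soft_labels` name are assumptions:

```python
import numpy as np

def maskcon_soft_labels(sim_aug, coarse_q, coarse_keys, temperature=0.1, w=0.5):
    """Sketch of coarse-masked soft-label generation.

    sim_aug:     similarities between the query's *other* augmented view and
                 the K other samples (keys), shape (K,).
    coarse_q:    coarse label of the query.
    coarse_keys: coarse labels of the K keys, shape (K,).
    Returns a distribution over [self-view, key_0, ..., key_{K-1}].
    Assumes at least one key shares the query's coarse class.
    """
    # Keys from a different coarse class are masked out entirely.
    mask = coarse_keys == coarse_q
    logits = np.full(sim_aug.shape, -np.inf)
    logits[mask] = sim_aug[mask] / temperature
    # Numerically stable softmax over the surviving (same-coarse-class) keys.
    exp = np.zeros_like(logits)
    exp[mask] = np.exp(logits[mask] - logits[mask].max())
    inter_sample = exp / exp.sum()
    # Blend the one-hot "self" target (the query's own augmented view) with
    # the coarse-masked inter-sample distribution via the weight w.
    return np.concatenate(([1.0 - w], w * inter_sample))
```

For example, with keys labelled `["cat", "dog", "cat"]` and a `"cat"` query, the `"dog"` key always gets zero soft mass, while the two `"cat"` keys share the remaining `w` according to their similarities.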
Weakly supervised segmentation of intracranial aneurysms using a novel 3D focal modulation UNet
In the paper, we propose FocalSegNet, a novel 3D focal modulation UNet, to detect aneurysms and produce an initial, coarse segmentation from time-of-flight MRA image patches; this segmentation is further refined with a dense conditional random field (CRF) post-processing layer to produce the final segmentation map.