Nested Multiple Instance Learning with Attention Mechanisms

1 Nov 2021 · Saul Fuster, Trygve Eftestøl, Kjersti Engan

Strongly supervised learning requires detailed knowledge of truth labels at the instance level, and in many machine learning applications this is a major drawback. Multiple instance learning (MIL) is a popular weakly supervised learning method where truth labels are not available at the instance level, but only at the bag-of-instances level. However, sometimes the nature of the problem requires a more complex description, where a nested architecture of bags-of-bags at different levels can capture underlying relationships, such as similar instances grouped together. Predicting the latent labels of instances or inner bags might be as important as predicting the final bag-of-bags label, but this information is lost in a straightforward nested setting. We propose a Nested Multiple Instance with Attention (NMIA) model architecture combining the concept of nesting with attention mechanisms. We show that NMIA performs as conventional MIL in simple scenarios and can handle a complex scenario, providing insights into the latent labels at different levels.
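The nesting idea above can be sketched with attention-based MIL pooling applied at two levels: instances are pooled into inner-bag embeddings, and those embeddings are pooled again into the outer-bag representation, with the attention weights at each level offering a view of the latent labels. The following is a minimal numpy sketch, not the authors' implementation; the tanh-attention form follows the common attention-based MIL pooling style, and all names, dimensions, and parameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_pool(H, V, w):
    """Pool a bag of embeddings H (n, d) into one embedding (d,).

    Attention scores follow a common tanh-attention form:
    a = softmax(tanh(H V) w). Returns the pooled embedding and
    the attention weights, which hint at each member's relevance.
    """
    scores = np.tanh(H @ V) @ w   # (n,) unnormalized scores
    a = softmax(scores)           # (n,) attention weights, sum to 1
    z = a @ H                     # (d,) attention-weighted bag embedding
    return z, a

# Illustrative sizes: embedding dim d, attention dim k (assumptions).
d, k = 4, 8
V1, w1 = rng.normal(size=(d, k)), rng.normal(size=k)  # inner-level params
V2, w2 = rng.normal(size=(d, k)), rng.normal(size=k)  # outer-level params

# Toy nested bag: one outer bag of 3 inner bags with 5, 3, 7 instances.
inner_bags = [rng.normal(size=(n, d)) for n in (5, 3, 7)]

# Level 1: pool instances into inner-bag embeddings.
inner_embs, inner_atts = zip(*(attention_pool(H, V1, w1) for H in inner_bags))

# Level 2: pool inner-bag embeddings into the outer-bag embedding.
z_outer, a_outer = attention_pool(np.stack(inner_embs), V2, w2)
```

In a trained model the pooled outer embedding `z_outer` would feed a bag-level classifier, while the per-level attention weights (`inner_atts`, `a_outer`) provide the interpretable signal about which instances and inner bags drive the prediction.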
