Linear evaluation

66 papers with code • 1 benchmark • 1 dataset

Linear evaluation (also called linear probing) measures the quality of learned representations: a pretrained encoder is frozen, a single linear classifier is trained on top of its features, and the classifier's downstream accuracy is reported. It is the standard protocol for comparing self-supervised learning methods.
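The protocol can be sketched minimally as follows. This is an illustrative toy, not any particular paper's setup: a fixed random projection stands in for the frozen pretrained encoder, and a logistic-regression probe is trained on its features with plain gradient descent.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical frozen "encoder": a fixed random projection standing in for
# a pretrained network whose weights are never updated during evaluation.
W_enc = rng.normal(size=(32, 16))

def encode(x):
    # Frozen features; in a real setup no gradient flows into the encoder.
    return np.maximum(x @ W_enc, 0.0)

# Toy labelled data: two well-separated Gaussian blobs.
x0 = rng.normal(loc=-1.0, size=(100, 32))
x1 = rng.normal(loc=+1.0, size=(100, 32))
X = np.vstack([x0, x1])
y = np.array([0] * 100 + [1] * 100)

feats = encode(X)  # computed once; the encoder stays fixed

# Linear probe: logistic regression trained by gradient descent.
w = np.zeros(feats.shape[1])
b = 0.0
lr = 0.1
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(feats @ w + b)))
    w -= lr * feats.T @ (p - y) / len(y)
    b -= lr * np.mean(p - y)

acc = np.mean(((feats @ w + b) > 0) == y)
print(f"linear-probe accuracy: {acc:.2f}")
```

The reported accuracy of this probe, not the encoder's pretraining loss, is what linear evaluation benchmarks compare.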


Most implemented papers

Bootstrap your own latent: A new approach to self-supervised Learning

deepmind/deepmind-research 13 Jun 2020

From an augmented view of an image, we train the online network to predict the target network representation of the same image under a different augmented view.
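The BYOL objective described above can be sketched with toy linear networks. This is a simplified illustration under assumed shapes, not the paper's implementation: the online branch (encoder plus predictor) regresses the L2-normalized target representation of a second augmented view, and the target network is then updated as an exponential moving average of the online network.

```python
import numpy as np

rng = np.random.default_rng(0)

def normalize(z):
    # L2-normalize each row; BYOL compares unit-norm vectors.
    return z / np.linalg.norm(z, axis=1, keepdims=True)

# Hypothetical linear "networks" standing in for the online and target
# encoders and the online predictor head.
online = rng.normal(size=(8, 4))
target = online.copy()           # target starts as a copy of the online net
predictor = rng.normal(size=(4, 4))

def byol_loss(v1, v2):
    p = normalize(v1 @ online @ predictor)  # online prediction from view 1
    z = normalize(v2 @ target)              # target representation of view 2
    # MSE between normalized vectors, equal to 2 - 2 * cosine similarity.
    return np.mean(np.sum((p - z) ** 2, axis=1))

x = rng.normal(size=(16, 8))
v1 = x + 0.1 * rng.normal(size=x.shape)  # two augmented views of the same input
v2 = x + 0.1 * rng.normal(size=x.shape)
loss = byol_loss(v1, v2)

# Only the online branch receives gradients; the target network is updated
# as an exponential moving average of the online weights after each step.
tau = 0.99
target = tau * target + (1 - tau) * online
```

Because both vectors are unit-norm, the per-sample loss lies in [0, 4]; gradients are stopped through the target branch in the real method.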

Emerging Properties in Self-Supervised Vision Transformers

facebookresearch/dino ICCV 2021

In this paper, we question if self-supervised learning provides new properties to Vision Transformer (ViT) that stand out compared to convolutional networks (convnets).

Bootstrap Your Own Latent - A New Approach to Self-Supervised Learning

open-mmlab/mmselfsup NeurIPS 2020

From an augmented view of an image, we train the online network to predict the target network representation of the same image under a different augmented view.

Self-Supervised Learning with Swin Transformers

SwinTransformer/Transformer-SSL 10 May 2021

We are witnessing a modeling shift from CNN to Transformers in computer vision.

Solo-learn: A Library of Self-supervised Methods for Visual Representation Learning

lightly-ai/lightly 3 Aug 2021

This paper presents solo-learn, a library of self-supervised methods for visual representation learning.

Contrastive Multi-View Representation Learning on Graphs

kavehhassani/mvgrl ICML 2020

We achieve new state-of-the-art results in self-supervised learning on 8 out of 8 node and graph classification benchmarks under the linear evaluation protocol.

Learning Representations by Maximizing Mutual Information Across Views

philip-bachman/amdim-public NeurIPS 2019

Following our proposed approach, we develop a model which learns image representations that significantly outperform prior methods on the tasks we consider.

BYOL works even without batch statistics

lucidrains/byol-pytorch 20 Oct 2020

Bootstrap Your Own Latent (BYOL) is a self-supervised learning approach for image representation.

Matrix Information Theory for Self-Supervised Learning

yifanzhang-pro/matrix-ssl 27 May 2023

Inspired by this framework, we introduce Matrix-SSL, a novel approach that leverages matrix information theory to interpret the maximum entropy encoding loss as matrix uniformity loss.

How Useful is Self-Supervised Pretraining for Visual Tasks?

princeton-vl/selfstudy CVPR 2020

We investigate what factors may play a role in the utility of these pretraining methods for practitioners.