
BECLR: Batch Enhanced Contrastive Few-Shot Learning

Stylianos Poulakakis-Daktylidis; Hadi Jamali-Rad
Abstract

Learning quickly from very few labeled samples is a fundamental attribute that separates machines from humans in the era of deep representation learning. Unsupervised few-shot learning (U-FSL) aspires to bridge this gap by discarding the reliance on annotations at training time. Intrigued by the success of contrastive learning approaches in the realm of U-FSL, we structurally address their shortcomings in both the pretraining and downstream inference stages. We propose a novel Dynamic Clustered mEmory (DyCE) module to promote a highly separable latent representation space, enhancing positive sampling during pretraining and infusing implicit class-level insights into unsupervised contrastive learning. We then tackle the somewhat overlooked yet critical issue of sample bias at the few-shot inference stage. We propose an iterative Optimal Transport-based distribution Alignment (OpTA) strategy and demonstrate that it efficiently addresses the problem, especially in low-shot scenarios where FSL approaches suffer the most from sample bias. We later discuss that DyCE and OpTA are two intertwined pieces of a novel end-to-end approach (which we coin BECLR), constructively magnifying each other's impact. We then present a suite of extensive quantitative and qualitative experiments to corroborate that BECLR sets a new state of the art across ALL existing U-FSL benchmarks (to the best of our knowledge) and significantly outperforms the best of the current baselines (codebase available at: https://github.com/stypoumic/BECLR).
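To make the distribution-alignment idea behind OpTA concrete, the sketch below shows one way an iterative optimal-transport step can shift class prototypes (computed from a biased, tiny support set) toward the unlabeled query distribution. This is a minimal illustration using a standard Sinkhorn-Knopp solver, not the paper's exact formulation; the function names (sinkhorn, opta_align), the squared-Euclidean cost, the number of passes, and the entropy-regularization value are all assumptions made for this example. See the official codebase linked above for the authors' implementation.

```python
import numpy as np

def sinkhorn(cost, n_iters=10, eps=0.1):
    """Entropy-regularized optimal transport via Sinkhorn-Knopp.

    cost: (N, K) pairwise cost between N query embeddings and K prototypes.
    Returns a soft transport plan of shape (N, K) whose rows sum to ~1.
    """
    Q = np.exp(-cost / eps)
    Q /= Q.sum()
    N, K = Q.shape
    for _ in range(n_iters):
        Q /= Q.sum(axis=1, keepdims=True)  # enforce uniform row (query) marginals
        Q /= N
        Q /= Q.sum(axis=0, keepdims=True)  # enforce uniform column (class) marginals
        Q /= K
    return Q * N

def opta_align(prototypes, queries, n_passes=3, eps=0.1):
    """Iteratively transport class prototypes toward the query distribution.

    prototypes: (K, D) prototypes estimated from the few labeled support samples.
    queries:    (N, D) unlabeled query embeddings from the same episode.
    The repeated transport-and-update loop is an illustrative assumption.
    """
    for _ in range(n_passes):
        # Squared Euclidean cost between every query and every prototype.
        cost = ((queries[:, None, :] - prototypes[None, :, :]) ** 2).sum(-1)
        plan = sinkhorn(cost, eps=eps)                        # (N, K) soft assignments
        weights = plan / (plan.sum(axis=0, keepdims=True) + 1e-8)
        prototypes = weights.T @ queries                      # re-estimated prototypes
    return prototypes
```

In a 5-way 1-shot episode, for instance, the five single-sample prototypes are typically poor estimates of the class means; re-estimating them as transport-weighted averages of the query embeddings pulls them toward denser regions of the latent space, which is the intuition behind mitigating sample bias in the low-shot regime.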
