HyperAI超神经
Domain Adaptation On Office Home
Evaluation metric: Accuracy

Results: the performance of each model on this benchmark.
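As a point of reference for how the single Accuracy number is usually obtained on Office-Home: the dataset has four domains (Art, Clipart, Product, Real-World), unsupervised domain adaptation methods are evaluated on all 12 ordered source→target pairs, and the leaderboard score is the mean of the 12 per-task target accuracies. The sketch below shows that averaging step only; the per-task values in it are made-up placeholders, not results from any paper on this table.

```python
from itertools import permutations

DOMAINS = ["Art", "Clipart", "Product", "Real-World"]


def mean_accuracy(per_task_acc: dict) -> float:
    """Average target accuracy over the 12 Office-Home transfer tasks.

    `per_task_acc` maps each ordered (source, target) domain pair to the
    classification accuracy (in percent) achieved on the target domain.
    """
    tasks = list(permutations(DOMAINS, 2))  # 12 ordered (source, target) pairs
    assert set(per_task_acc) == set(tasks), "need all 12 transfer tasks"
    return sum(per_task_acc[t] for t in tasks) / len(tasks)


# Purely illustrative placeholder accuracies, one per transfer task.
example = {t: 70.0 for t in permutations(DOMAINS, 2)}
print(mean_accuracy(example))  # → 70.0
```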
| Model | Accuracy (%) | Paper Title | Repository |
|---|---|---|---|
| CMKD | 89.0 | Unsupervised Domain Adaption Harnessing Vision-Language Pre-training | |
| DAOD | 69.8 | Open Set Domain Adaptation: Theoretical Bound and Algorithm | |
| FixBi | 72.7 | FixBi: Bridging Domain Spaces for Unsupervised Domain Adaptation | |
| RCL | 90.0 | Empowering Source-Free Domain Adaptation with MLLM-driven Curriculum Learning | |
| PGA (ViT-B/16) | 85.1 | Enhancing Domain Adaptation through Prompt Gradient Alignment | |
| SWG | 92.3 | Combining inherent knowledge of vision-language models with unsupervised domain adaptation through strong-weak guidance | |
| SHOT | 71.8 | Do We Really Need to Access the Source Data? Source Hypothesis Transfer for Unsupervised Domain Adaptation | |
| MEDM | 69.5 | Entropy Minimization vs. Diversity Maximization for Domain Adaptation | |
| MEDA (ResNet50) | 67.6 | Transfer Learning with Dynamic Distribution Adaptation | |
| CoVi | 73.1 | Contrastive Vicinal Space for Unsupervised Domain Adaptation | |
| GSDE | 73.6 | Gradual Source Domain Expansion for Unsupervised Domain Adaptation | |
| MCC+NWD | 72.6 | Reusing the Task-specific Classifier as a Discriminator: Discriminator-free Adversarial Domain Adaptation | |
| PGA (ViT-L/14) | 89.4 | Enhancing Domain Adaptation through Prompt Gradient Alignment | |
| PMTrans | 89.0 | Patch-Mix Transformer for Unsupervised Domain Adaptation: A Game Perspective | - |
| CDTrans (DeiT-B) | 80.5 | CDTrans: Cross-domain Transformer for Unsupervised Domain Adaptation | |
| BIWAA | 71.5 | Backprop Induced Feature Weighting for Adversarial Domain Adaptation with Iterative Label Distribution Alignment | |
| ELS | 84.6 | Free Lunch for Domain Adversarial Training: Environment Label Smoothing | |
| MIC | 86.2 | MIC: Masked Image Consistency for Context-Enhanced Domain Adaptation | |
| SDAT (ViT-B/16) | 84.3 | A Closer Look at Smoothness in Domain Adversarial Training | |
| EvoADA | 73.9 | On Evolving Attention Towards Domain Adaptation | - |
(Showing 20 of the 29 benchmark entries; the source page is paginated.)