
Cross-Domain Ensemble Distillation for Domain Generalization

Suha Kwak, Sungyeon Kim, Kyungmoon Lee

Abstract

In domain generalization, the task of learning a model that generalizes to unseen target domains using multiple source domains, many approaches explicitly align the distributions of the domains. However, optimizing for domain alignment risks overfitting, since the target domain is not available during training. To address this issue, this paper proposes a domain generalization method based on self-distillation. The proposed method trains a model that is robust to domain shift by allowing meaningful erroneous predictions across multiple domains. Specifically, it matches each predictive distribution to the ensemble of predictive distributions of data that share the same class label but come from different domains. We also propose a de-stylization method that standardizes the feature maps of images to help produce consistent predictions. Image classification experiments on two benchmarks demonstrate that the proposed method greatly improves performance in both single-source and multi-source settings, and person re-identification experiments confirm that it is also effective there. Our method significantly improves performance in all experiments.
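The core idea described above, matching each per-domain predictive distribution to the cross-domain ensemble, can be illustrated with a minimal NumPy sketch. This is an assumption-laden illustration, not the paper's implementation: the function names, the temperature value, and the exact direction of the KL divergence are hypothetical choices made here for clarity.

```python
import numpy as np

def softmax(logits, tau=1.0):
    # Temperature-scaled softmax; subtracting the max gives numerical stability.
    z = logits / tau
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def cross_domain_ensemble_distillation(logits_per_domain, tau=2.0):
    """Sketch of an ensemble-distillation loss (hypothetical form).

    logits_per_domain: array of shape (D, C) holding classifier logits for
    same-class samples drawn from D different source domains.
    The ensemble (mean) of the D softened distributions acts as the teacher,
    and each per-domain distribution is pulled toward it via KL divergence.
    """
    probs = softmax(np.asarray(logits_per_domain, dtype=float), tau)  # (D, C)
    ensemble = probs.mean(axis=0, keepdims=True)                      # (1, C) teacher
    eps = 1e-12                                                       # avoid log(0)
    # KL(ensemble || per-domain prediction), averaged over the D domains.
    kl = (ensemble * (np.log(ensemble + eps) - np.log(probs + eps))).sum(axis=-1)
    return float(kl.mean())
```

When all domains already agree, the ensemble equals each member and the loss is zero; disagreement across domains yields a positive penalty, which is the regularization pressure toward domain-consistent predictions.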

