
Is Synthetic Data From Diffusion Models Ready for Knowledge Distillation?

Zheng Li, Yuxuan Li, Penghai Zhao, Renjie Song, Xiang Li, Jian Yang

Abstract

Diffusion models have recently achieved astonishing performance in generating high-fidelity photo-realistic images. Given their huge success, it is still unclear whether synthetic images are applicable for knowledge distillation when real images are unavailable. In this paper, we extensively study whether and how synthetic images produced from state-of-the-art diffusion models can be used for knowledge distillation without access to real images, and obtain three key conclusions: (1) synthetic data from diffusion models can easily lead to state-of-the-art performance among existing synthesis-based distillation methods, (2) low-fidelity synthetic images are better teaching materials, and (3) relatively weak classifiers are better teachers. Code is available at https://github.com/zhengli97/DM-KD.
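For context, a minimal sketch of the standard temperature-scaled knowledge-distillation objective that such synthesis-based methods build on, applied to batches of synthetic images instead of real ones. This is not the authors' released implementation (see the repository above); the `teacher`, `student`, and data names are hypothetical placeholders.

```python
# Minimal knowledge-distillation sketch: a student mimics a frozen teacher's
# soft predictions on synthetic images, with no real labels involved.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, temperature=4.0):
    """KL divergence between temperature-softened teacher and student outputs."""
    log_p_student = F.log_softmax(student_logits / temperature, dim=1)
    p_teacher = F.softmax(teacher_logits / temperature, dim=1)
    # Scale by T^2 so gradient magnitudes stay comparable across temperatures.
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * temperature ** 2

def distill_step(student, teacher, synthetic_images, optimizer, temperature=4.0):
    """One training step of the student on a batch of diffusion-generated images."""
    teacher.eval()
    with torch.no_grad():
        teacher_logits = teacher(synthetic_images)
    student_logits = student(synthetic_images)
    loss = distillation_loss(student_logits, teacher_logits, temperature)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

Under this objective, the paper's findings translate into practical choices: generating deliberately low-fidelity images (e.g. fewer sampling steps) and pairing the student with a relatively weak teacher both improve the distilled student's accuracy.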

