Image Generation on ImageNet 32x32
Metric
bpd (bits per dimension)
Results
Performance results of various models on this benchmark. Lower bpd is better; a "-" entry means the model does not report bpd on this benchmark.
| Model name | bpd | Paper Title |
| --- | --- | --- |
| NVAE w/ flow | 3.92 | NVAE: A Deep Hierarchical Variational Autoencoder |
| Glow (Kingma and Dhariwal, 2018) | 4.09 | Glow: Generative Flow with Invertible 1x1 Convolutions |
| MintNet | 4.06 | MintNet: Building Invertible Neural Networks with Masked Convolutions |
| Residual Flow | 4.01 | Residual Flows for Invertible Generative Modeling |
| VDM | 3.72 | Variational Diffusion Models |
| SPN (Menick and Kalchbrenner, 2019) | 3.85 | Generating High Fidelity Images with Subscale Pixel Networks and Multidimensional Upscaling |
| StyleGAN-XL | - | StyleGAN-XL: Scaling StyleGAN to Large Diverse Datasets |
| δ-VAE | 3.77 | Preventing Posterior Collapse with delta-VAEs |
| PaGoDA | - | PaGoDA: Progressive Growing of a One-Step Generator from a Low-Resolution Diffusion Teacher |
| Very Deep VAE | 3.8 | Very Deep VAEs Generalize Autoregressive Models and Can Outperform Them on Images |
| PixelRNN | 3.86 | Pixel Recurrent Neural Networks |
| Hourglass | 3.74 | Hierarchical Transformers Are More Efficient Language Models |
| DDPM++ (VP, NLL) + ST | 3.85 | Soft Truncation: A Universal Training Technique of Score-based Diffusion Model for High Precision Score Estimation |
| i-DODE | 3.43 | Improved Techniques for Maximum Likelihood Estimation for Diffusion ODEs |
| MRCNF | 3.77 | Multi-Resolution Continuous Normalizing Flows |
| Flow++ | 3.86 | Flow++: Improving Flow-Based Generative Models with Variational Dequantization and Architecture Design |
| BIVA (Maaløe et al., 2019) | 3.96 | BIVA: A Very Deep Hierarchy of Latent Variables for Generative Modeling |
| Reflected Diffusion | 3.74 | Reflected Diffusion Models |
| NDM | 3.55 | Neural Diffusion Models |
| DDPM | 3.89 | Denoising Diffusion Probabilistic Models |
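For reference, bpd is the model's average negative log-likelihood on the test images, converted from nats to bits and divided by the number of image dimensions (32 × 32 × 3 = 3072 for this benchmark). Below is a minimal sketch of that conversion; the function name and the example NLL value are illustrative and not taken from any of the listed papers.

```python
import math

def nats_to_bpd(nll_nats_per_image: float,
                height: int = 32, width: int = 32, channels: int = 3) -> float:
    """Convert a per-image negative log-likelihood in nats to bits per dimension.

    bpd = NLL / (D * ln 2), where D = height * width * channels is the
    number of dimensions (sub-pixels) of the image.
    """
    num_dims = height * width * channels
    return nll_nats_per_image / (num_dims * math.log(2))

# Illustrative (hypothetical) example: an average test NLL of about
# 7950 nats per 32x32x3 image corresponds to roughly 3.73 bpd.
print(round(nats_to_bpd(7950.0), 2))
```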