DenseNets Reloaded: Paradigm Shift Beyond ResNets and ViTs

Donghyun Kim, Byeongho Heo, Dongyoon Han

Abstract

This paper revives Densely Connected Convolutional Networks (DenseNets) and reveals their underrated effectiveness compared with predominant ResNet-style architectures. We believe DenseNets' potential was overlooked because untouched training methods and traditional design elements did not fully reveal their capabilities. Our pilot study shows that dense connections through concatenation are strong, demonstrating that DenseNets can be revitalized to compete with modern architectures. We methodically refine suboptimal components (architectural adjustments, block redesign, and improved training recipes) toward widening DenseNets and boosting memory efficiency while keeping concatenation shortcuts. Our models, employing simple architectural elements, ultimately surpass Swin Transformer, ConvNeXt, and DeiT-III, key architectures in the residual learning lineage. Furthermore, our models exhibit near state-of-the-art performance on ImageNet-1K, competing with very recent models, and perform strongly on downstream tasks such as ADE20K semantic segmentation and COCO object detection/instance segmentation. Finally, we provide empirical analyses that uncover the merits of concatenation over additive shortcuts, steering a renewed preference towards DenseNet-style designs. Our code is available at https://github.com/naver-ai/rdnet.
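
As a rough illustration of the distinction the abstract draws, the minimal PyTorch sketch below contrasts an additive ResNet-style shortcut with a DenseNet-style concatenation shortcut. It is not the paper's RDNet implementation; the module names, layer choices, and channel sizes are illustrative assumptions.

```python
# Minimal sketch (not the paper's RDNet code): additive vs. concatenation shortcuts.
import torch
import torch.nn as nn


class AdditiveBlock(nn.Module):
    """ResNet-style block: the shortcut is added, so channel width stays fixed."""
    def __init__(self, channels: int):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.BatchNorm2d(channels),
        )

    def forward(self, x):
        return torch.relu(x + self.conv(x))  # identity + residual


class ConcatBlock(nn.Module):
    """DenseNet-style block: new features are concatenated onto the input,
    so the feature width grows by `growth_rate` channels per block."""
    def __init__(self, in_channels: int, growth_rate: int):
        super().__init__()
        self.conv = nn.Sequential(
            nn.BatchNorm2d(in_channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(in_channels, growth_rate, 3, padding=1),
        )

    def forward(self, x):
        return torch.cat([x, self.conv(x)], dim=1)  # keep all earlier features


if __name__ == "__main__":
    x = torch.randn(1, 64, 56, 56)
    print(AdditiveBlock(64)(x).shape)        # torch.Size([1, 64, 56, 56])
    print(ConcatBlock(64, 32)(x).shape)      # torch.Size([1, 96, 56, 56])
```

Because concatenation keeps all earlier feature maps available to later blocks while steadily widening the representation, the abstract's emphasis on widening DenseNets and improving memory efficiency follows directly from this design choice.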

