DenseNets Reloaded: Paradigm Shift Beyond ResNets and ViTs

This paper revives Densely Connected Convolutional Networks (DenseNets) and reveals their underrated effectiveness over predominant ResNet-style architectures. We believe DenseNets' potential was overlooked due to untouched training methods and traditional design elements not fully revealing their capabilities. Our pilot study shows that dense connections through concatenation are strong, demonstrating that DenseNets can be revitalized to compete with modern architectures. We methodically refine suboptimal components (architectural adjustments, block redesign, and improved training recipes) towards widening DenseNets and boosting memory efficiency while keeping concatenation shortcuts. Our models, employing simple architectural elements, ultimately surpass Swin Transformer, ConvNeXt, and DeiT-III, key architectures in the residual learning lineage. Furthermore, our models exhibit near state-of-the-art performance on ImageNet-1K, competing with very recent models, and perform strongly on downstream tasks: ADE20K semantic segmentation and COCO object detection/instance segmentation. Finally, we provide empirical analyses that uncover the merits of concatenation over additive shortcuts, steering a renewed preference towards DenseNet-style designs. Our code is available at https://github.com/naver-ai/rdnet.
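
To make the central contrast concrete, below is a minimal sketch of the two shortcut styles, assuming PyTorch; the ResidualBlock/DenseLayer names and layer shapes are illustrative only, not the paper's RDNet implementation.

# Illustrative sketch only: contrasts additive vs. concatenation shortcuts.
# Not the RDNet implementation from the paper; assumes PyTorch.
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """ResNet-style block: features merge by element-wise addition,
    so the channel width stays fixed across the block."""
    def __init__(self, channels: int):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.BatchNorm2d(channels),
        )

    def forward(self, x):
        return torch.relu(x + self.conv(x))  # additive shortcut

class DenseLayer(nn.Module):
    """DenseNet-style layer: new features are concatenated onto the input,
    so later layers see all earlier feature maps and the width grows by
    growth_rate channels per layer."""
    def __init__(self, in_channels: int, growth_rate: int):
        super().__init__()
        self.conv = nn.Sequential(
            nn.BatchNorm2d(in_channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(in_channels, growth_rate, 3, padding=1),
        )

    def forward(self, x):
        return torch.cat([x, self.conv(x)], dim=1)  # concatenation shortcut

x = torch.randn(1, 64, 32, 32)
print(ResidualBlock(64)(x).shape)               # torch.Size([1, 64, 32, 32]): width unchanged
print(DenseLayer(64, growth_rate=32)(x).shape)  # torch.Size([1, 96, 32, 32]): width grows

The shape check at the end highlights the structural difference the abstract refers to: additive shortcuts keep channel width fixed, while concatenation shortcuts widen the representation with each layer, which is why the paper's refinements target wider, more memory-efficient DenseNets while keeping concatenation.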