
Wide Activation for Efficient and Accurate Image Super-Resolution

Jiahui Yu, Yuchen Fan, Jianchao Yang, Ning Xu, Zhaowen Wang, Xinchao Wang, Thomas Huang

Abstract

In this report we demonstrate that, with the same parameters and computational budgets, models with wider features before ReLU activation have significantly better performance for single image super-resolution (SISR). The resulting SR residual network has a slim identity-mapping pathway with wider (2× to 4×) channels before activation in each residual block. To widen activation further (6× to 9×) without computational overhead, we introduce linear low-rank convolution into SR networks and achieve even better accuracy-efficiency trade-offs. In addition, compared with batch normalization or no normalization, we find that training with weight normalization leads to better accuracy for deep super-resolution networks. Our proposed SR network WDSR achieves better results on the large-scale DIV2K image super-resolution benchmark in terms of PSNR with the same or lower computational complexity. Based on WDSR, our method also won first place in all three realistic tracks of the NTIRE 2018 Challenge on Single Image Super-Resolution. Experiments and ablation studies support the importance of wide activation for image super-resolution. Code is released at: https://github.com/JiahuiYu/wdsr_ntire2018
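To make the two block designs concrete, below is a minimal sketch in PyTorch (the released repository linked above is also PyTorch). The class names WideActBlock and LowRankWideActBlock, the expansion factors, and the low-rank ratio are illustrative assumptions rather than the authors' exact implementation; the sketch only shows the three ingredients named in the abstract: widening channels before the ReLU, the linear low-rank factorization, and weight normalization on the convolutions.

```python
# Minimal sketch of wide-activation residual blocks, assuming PyTorch.
# Class names and hyperparameter values are illustrative, not the released API.
import torch
import torch.nn as nn
from torch.nn.utils import weight_norm


class WideActBlock(nn.Module):
    """Wide-activation block: expand channels 2x-4x before ReLU, project back."""

    def __init__(self, n_feats, expansion=4):
        super().__init__()
        wide = n_feats * expansion
        self.body = nn.Sequential(
            weight_norm(nn.Conv2d(n_feats, wide, 3, padding=1)),  # widen before activation
            nn.ReLU(inplace=True),
            weight_norm(nn.Conv2d(wide, n_feats, 3, padding=1)),  # project back to the slim pathway
        )

    def forward(self, x):
        return x + self.body(x)  # slim identity-mapping pathway


class LowRankWideActBlock(nn.Module):
    """Linear low-rank variant: factorizing the second conv into a linear 1x1
    reduction followed by a 3x3 conv frees parameters, so activation can widen
    6x-9x at roughly the same cost."""

    def __init__(self, n_feats, expansion=6, low_rank_ratio=0.8):
        super().__init__()
        wide = n_feats * expansion
        low = int(n_feats * low_rank_ratio)
        self.body = nn.Sequential(
            weight_norm(nn.Conv2d(n_feats, wide, 1)),            # 1x1 expansion, the only activation follows
            nn.ReLU(inplace=True),
            weight_norm(nn.Conv2d(wide, low, 1)),                # linear 1x1 low-rank reduction (no ReLU)
            weight_norm(nn.Conv2d(low, n_feats, 3, padding=1)),  # linear 3x3 spatial convolution
        )

    def forward(self, x):
        return x + self.body(x)


if __name__ == "__main__":
    block = LowRankWideActBlock(n_feats=32)
    out = block(torch.randn(1, 32, 48, 48))
    print(out.shape)  # torch.Size([1, 32, 48, 48]): width and spatial size preserved
```

Note how both blocks keep the identity pathway at the slim width n_feats and spend their parameter budget on the features that pass through the ReLU, which is the core claim of the report; weight_norm stands in for the weight-normalization training the abstract recommends over batch normalization.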

