Invertible Image Rescaling

High-resolution digital images are usually downscaled to fit various display screens or to save storage and bandwidth costs, while post-upscaling is adopted to recover the original resolution or the details in zoomed-in images. However, typical image downscaling is a non-injective mapping due to the loss of high-frequency information, which makes the inverse upscaling procedure ill-posed and poses great challenges for recovering details from the downscaled low-resolution images. Simply upscaling with image super-resolution methods yields unsatisfactory recovery performance. In this work, we propose to solve this problem by modeling the downscaling and upscaling processes from a new perspective, i.e., as an invertible bijective transformation, which can largely mitigate the ill-posed nature of image upscaling. We develop an Invertible Rescaling Net (IRN) with a deliberately designed framework and objectives to produce visually pleasing low-resolution images and, meanwhile, capture the distribution of the lost information using a latent variable following a specified distribution during downscaling. In this way, upscaling is made tractable by inversely passing a randomly drawn latent variable together with the low-resolution image through the network. Experimental results demonstrate the significant improvement of our model over existing methods in both quantitative and qualitative evaluations of image upscaling reconstruction from downscaled images.
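
To make the bijective-rescaling idea concrete, below is a minimal sketch in PyTorch. It is not the authors' implementation: the class names (HaarSqueeze, AffineCoupling, ToyIRN), the single coupling block, and the channel/layer sizes are illustrative assumptions. It only shows the core mechanism: the forward pass splits a high-resolution image into a low-resolution image plus latent channels (which training would push toward a standard normal), and upscaling runs the same network in reverse with the latent re-sampled from N(0, 1).

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class HaarSqueeze(nn.Module):
    """Invertible 2x downsampling: rearranges each 2x2 patch into 4 channels."""
    def forward(self, x):
        return F.pixel_unshuffle(x, 2)
    def inverse(self, y):
        return F.pixel_shuffle(y, 2)

class AffineCoupling(nn.Module):
    """One invertible coupling block: the first `split` channels (the LR branch)
    parameterize an affine transform of the remaining (latent) channels."""
    def __init__(self, channels, split):
        super().__init__()
        self.split = split
        self.net = nn.Sequential(
            nn.Conv2d(split, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 2 * (channels - split), 3, padding=1))
    def forward(self, x):
        a, b = x[:, :self.split], x[:, self.split:]
        log_s, t = self.net(a).chunk(2, dim=1)
        return torch.cat([a, b * torch.exp(torch.tanh(log_s)) + t], dim=1)
    def inverse(self, y):
        a, b = y[:, :self.split], y[:, self.split:]
        log_s, t = self.net(a).chunk(2, dim=1)
        return torch.cat([a, (b - t) * torch.exp(-torch.tanh(log_s))], dim=1)

class ToyIRN(nn.Module):
    """Downscale: HR -> (LR, z). Upscale: (LR, z ~ N(0, 1)) -> HR."""
    def __init__(self):
        super().__init__()
        self.squeeze = HaarSqueeze()
        self.coupling = AffineCoupling(channels=12, split=3)
    def downscale(self, hr):
        y = self.coupling(self.squeeze(hr))
        lr, z = y[:, :3], y[:, 3:]   # 3 LR channels, 9 latent channels
        return lr, z                 # training objectives push z toward N(0, 1)
    def upscale(self, lr):
        z = torch.randn(lr.size(0), 9, lr.size(2), lr.size(3))
        x = torch.cat([lr, z], dim=1)
        return self.squeeze.inverse(self.coupling.inverse(x))

model = ToyIRN()
hr = torch.rand(1, 3, 64, 64)
lr, z = model.downscale(hr)          # lr: (1, 3, 32, 32)
hr_rec = model.upscale(lr)           # reconstructed (1, 3, 64, 64) image
```

The full IRN stacks many such invertible blocks per scale and adds losses that keep the LR output visually faithful and the latent close to the specified distribution; this sketch omits those details and shows only why the upscaling direction becomes tractable once the mapping is bijective.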