Deep Outdoor Illumination Estimation

We present a CNN-based technique to estimate high dynamic range outdoor illumination from a single low dynamic range image. To train the CNN, we leverage a large dataset of outdoor panoramas. We fit a low-dimensional physically-based outdoor illumination model to the skies in these panoramas, giving us a compact set of parameters (including sun position, atmospheric conditions, and camera parameters). We extract limited field-of-view images from the panoramas, and train a CNN with this large set of input image--output lighting parameter pairs. Given a test image, this network can be used to infer illumination parameters that can, in turn, be used to reconstruct an outdoor illumination environment map. We demonstrate that our approach allows the recovery of plausible illumination conditions and enables photorealistic virtual object insertion from a single image. An extensive evaluation on both the panorama dataset and captured HDR environment maps shows that our technique significantly outperforms previous solutions to this problem.
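
As a rough illustration of the training setup described above, and not the authors' implementation, the minimal sketch below assumes the CNN takes a 224x224 LDR crop and regresses the sun position (azimuth, elevation) together with a small vector of sky and camera parameters; the architecture, loss, and parameter count are illustrative assumptions.

    # Minimal sketch of a crop-to-lighting-parameters regressor (PyTorch).
    # Assumptions: 224x224 LDR crops, sun position as (azimuth, elevation),
    # and a 3-vector of remaining sky/camera parameters fitted to each panorama.
    import torch
    import torch.nn as nn

    class IlluminationCNN(nn.Module):
        def __init__(self, n_sky_params=3):
            super().__init__()
            # Shared convolutional encoder over the limited field-of-view crop.
            self.encoder = nn.Sequential(
                nn.Conv2d(3, 32, 5, stride=2, padding=2), nn.ELU(),
                nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ELU(),
                nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ELU(),
                nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            )
            # Two heads: sun position and remaining lighting/camera parameters.
            self.sun_head = nn.Linear(128, 2)               # (azimuth, elevation)
            self.param_head = nn.Linear(128, n_sky_params)  # e.g. atmospheric/camera terms

        def forward(self, x):
            feat = self.encoder(x)
            return self.sun_head(feat), self.param_head(feat)

    model = IlluminationCNN()
    crops = torch.randn(8, 3, 224, 224)   # batch of LDR crops extracted from panoramas
    sun_gt = torch.rand(8, 2)             # fitted sun positions (normalized)
    param_gt = torch.rand(8, 3)           # fitted sky/camera parameters (normalized)
    sun_pred, param_pred = model(crops)
    loss = nn.functional.mse_loss(sun_pred, sun_gt) + \
           nn.functional.mse_loss(param_pred, param_gt)
    loss.backward()

At test time, the predicted parameters would be fed back into the physically-based sky model to synthesize an HDR outdoor environment map for relighting and virtual object insertion.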