Sat2Density: Faithful Density Learning from Satellite-Ground Image Pairs

This paper aims to develop an accurate 3D geometry representation of satellite images using satellite-ground image pairs. Our focus is on the challenging problem of 3D-aware ground-view synthesis from a satellite image. We draw inspiration from the density field representation used in volumetric neural rendering and propose a new approach, called Sat2Density. Our method utilizes the properties of ground-view panoramas for the sky and non-sky regions to learn faithful density fields of 3D scenes from a geometric perspective. Unlike other methods that require extra depth information during training, our Sat2Density can automatically learn accurate and faithful 3D geometry via the density representation without depth supervision. This advancement significantly improves the ground-view panorama synthesis task. Additionally, our study provides a new geometric perspective for understanding the relationship between satellite and ground-view images in 3D space.
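To give intuition for the density field representation the abstract refers to, the following is a minimal NumPy sketch of the standard volume-rendering (alpha-compositing) equation from neural rendering, where a non-negative density at each sample along a ray determines how much that sample occludes and contributes. This is an illustrative sketch of the general technique, not the paper's implementation; all function and variable names here are hypothetical.

```python
import numpy as np

def composite_along_ray(densities, colors, deltas):
    """Alpha-composite colors along one ray from per-sample densities.

    densities: (N,) non-negative density sigma at each sample
    colors:    (N, 3) RGB color at each sample
    deltas:    (N,) spacing between consecutive samples
    Returns the rendered RGB and the ray's total opacity.
    """
    # Per-sample opacity: alpha_i = 1 - exp(-sigma_i * delta_i)
    alphas = 1.0 - np.exp(-densities * deltas)
    # Transmittance T_i: fraction of light surviving all earlier samples
    trans = np.cumprod(np.concatenate([[1.0], 1.0 - alphas[:-1]]))
    # Each sample contributes T_i * alpha_i of its color
    weights = trans * alphas
    rgb = (weights[:, None] * colors).sum(axis=0)
    opacity = weights.sum()
    return rgb, opacity

# Example: a ray passing through empty space, then a dense red region.
densities = np.array([0.0, 0.0, 50.0, 50.0])
colors = np.array([[0, 0, 0], [0, 0, 0], [1, 0, 0], [1, 0, 0]], dtype=float)
deltas = np.full(4, 0.25)
rgb, opacity = composite_along_ray(densities, colors, deltas)
# rgb is close to pure red and opacity is close to 1: the dense region
# dominates, which is how density alone encodes scene geometry.
```

In this formulation geometry is carried entirely by the density values, which is why supervising renderings of such a field (here, against ground-view panoramas) can recover 3D structure without explicit depth labels.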