
Unleashing HyDRa: Hybrid Fusion, Depth Consistency and Radar for Unified 3D Perception

Philipp Wolters, Johannes Gilg, Torben Teepe, Fabian Herzog, Anouar Laouichi, Martin Hofmann, Gerhard Rigoll
Abstract

Low-cost, vision-centric 3D perception systems for autonomous driving have made significant progress in recent years, narrowing the gap to expensive LiDAR-based methods. The primary challenge in becoming a fully reliable alternative lies in robust depth prediction capabilities, as camera-based systems struggle with long detection ranges and adverse lighting and weather conditions. In this work, we introduce HyDRa, a novel camera-radar fusion architecture for diverse 3D perception tasks. Building upon the principles of dense BEV (Bird's Eye View)-based architectures, HyDRa introduces a hybrid fusion approach to combine the strengths of complementary camera and radar features in two distinct representation spaces. Our Height Association Transformer module leverages radar features already in the perspective view to produce more robust and accurate depth predictions. In the BEV, we refine the initial sparse representation by a Radar-weighted Depth Consistency. HyDRa achieves a new state-of-the-art for camera-radar fusion of 64.2 NDS (+1.8) and 58.4 AMOTA (+1.5) on the public nuScenes dataset. Moreover, our new semantically rich and spatially accurate BEV features can be directly converted into a powerful occupancy representation, beating all previous camera-based methods on the Occ3D benchmark by an impressive 3.7 mIoU. Code and models are available at https://github.com/phi-wol/hydra.
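The two-stage idea described in the abstract — using radar evidence in the perspective view to sharpen depth prediction, then re-weighting sparse BEV features with a radar prior — can be illustrated with a toy NumPy sketch. This is not the paper's actual implementation (see the linked repository for that); the function names, the additive logit boost, and the occupancy-based weighting scheme here are all simplified assumptions chosen only to make the data flow concrete.

```python
import numpy as np

def fuse_depth_logits(cam_logits, radar_hits, boost=2.0):
    """Toy perspective-view fusion: additively boost camera depth-bin
    logits where radar returns agree (a hypothetical stand-in for the
    paper's Height Association Transformer). Shapes: (H, W, D)."""
    return cam_logits + boost * radar_hits

def radar_weighted_bev(bev_feats, radar_occ, alpha=0.5):
    """Toy BEV refinement: amplify BEV feature cells that carry radar
    evidence (a hypothetical stand-in for the Radar-weighted Depth
    Consistency step). bev_feats: (X, Y, C); radar_occ: (X, Y)."""
    weights = 1.0 + alpha * radar_occ
    return bev_feats * weights[..., None]

# Toy data: a 2x2 image grid with 4 depth bins, and a 2x2 BEV grid.
cam = np.zeros((2, 2, 4))                  # undecided camera depth logits
radar = np.zeros((2, 2, 4))
radar[0, 0, 2] = 1.0                       # radar return at depth bin 2
fused = fuse_depth_logits(cam, radar)
depth_bin = int(fused[0, 0].argmax())      # radar hint selects bin 2

bev = np.ones((2, 2, 8))
occ = np.array([[1.0, 0.0], [0.0, 0.0]])   # radar occupancy prior
refined = radar_weighted_bev(bev, occ)     # cell (0, 0) is amplified
```

The key design point the sketch mirrors is that fusion happens twice, in two representation spaces: once before depth is resolved (perspective view) and once after lifting to BEV.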
