MaskHand: Generative Masked Modeling for Robust Hand Mesh Reconstruction in the Wild

Saleem, Muhammad Usama; Pinyoanuntapong, Ekkasit; Patel, Mayur Jagdishbhai; Xue, Hongfei; Helmy, Ahmed; Das, Srijan; Wang, Pu
Abstract

Reconstructing a 3D hand mesh from a single RGB image is challenging due to complex articulations, self-occlusions, and depth ambiguities. Traditional discriminative methods, which learn a deterministic mapping from a 2D image to a single 3D mesh, often struggle with the inherent ambiguities in 2D-to-3D mapping. To address this challenge, we propose MaskHand, a novel generative masked model for hand mesh recovery that synthesizes plausible 3D hand meshes by learning and sampling from the probabilistic distribution of the ambiguous 2D-to-3D mapping process. MaskHand consists of two key components: (1) a VQ-MANO, which encodes 3D hand articulations as discrete pose tokens in a latent space, and (2) a Context-Guided Masked Transformer that randomly masks out pose tokens and learns their joint distribution, conditioned on the corrupted token sequence, image context, and 2D pose cues. This learned distribution facilitates confidence-guided sampling during inference, producing mesh reconstructions with low uncertainty and high precision. Extensive evaluations on benchmark and real-world datasets demonstrate that MaskHand achieves state-of-the-art accuracy, robustness, and realism in 3D hand mesh reconstruction. Project website: https://m-usamasaleem.github.io/publication/MaskHand/MaskHand.html.
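
The confidence-guided sampling described above can be pictured as iterative unmasking over the discrete pose tokens: start with every token masked, predict all positions, commit only the most confident predictions, and repeat. The sketch below is a minimal, assumption-laden illustration in PyTorch, not the authors' implementation; the token count, codebook size, cosine schedule, and the toy_predictor placeholder are all invented for illustration.

```python
import math

import torch
import torch.nn.functional as F

# Illustrative constants (assumptions, not values from the paper).
NUM_TOKENS = 16     # length of the discrete pose-token sequence
CODEBOOK = 512      # VQ-MANO codebook size
MASK_ID = CODEBOOK  # sentinel id marking a masked position
STEPS = 8           # number of iterative unmasking steps


def toy_predictor(tokens: torch.Tensor, context: torch.Tensor) -> torch.Tensor:
    """Stand-in for the conditional masked transformer.

    In the real model this would attend over the partially masked token
    sequence together with image-context and 2D-pose features; here it just
    returns random logits so the sampling loop is runnable end to end.
    """
    b, n = tokens.shape
    return torch.randn(b, n, CODEBOOK)


@torch.no_grad()
def confidence_guided_sample(context: torch.Tensor, steps: int = STEPS) -> torch.Tensor:
    """Start fully masked; at each step commit only the most confident tokens."""
    b = context.shape[0]
    tokens = torch.full((b, NUM_TOKENS), MASK_ID, dtype=torch.long)

    for step in range(steps):
        logits = toy_predictor(tokens, context)         # (B, N, CODEBOOK)
        probs = F.softmax(logits, dim=-1)
        conf, pred = probs.max(dim=-1)                  # per-position confidence & token id

        masked = tokens == MASK_ID
        # Cosine schedule: how many positions should remain masked after this step.
        keep_masked = int(NUM_TOKENS * math.cos(math.pi / 2 * (step + 1) / steps))

        # Tentatively fill every masked position with its predicted token ...
        filled = torch.where(masked, pred, tokens)
        if keep_masked > 0:
            # ... then re-mask the least confident of those predictions.
            conf = conf.masked_fill(~masked, float("inf"))  # never re-mask committed tokens
            remask = conf.topk(keep_masked, dim=-1, largest=False).indices
            filled.scatter_(1, remask, MASK_ID)
        tokens = filled

    return tokens  # (B, NUM_TOKENS) fully unmasked pose-token ids
```

A toy usage example, again with assumed shapes:

```python
context = torch.randn(2, 128)              # assumed pooled image + 2D-pose features
pose_tokens = confidence_guided_sample(context)
print(pose_tokens.shape)                   # torch.Size([2, 16])
```

In the actual pipeline, the sampled token ids would be decoded by the VQ-MANO decoder back into hand articulations and then into a 3D mesh; committing high-confidence tokens first is what lets the sampler favor low-uncertainty reconstructions.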