
UniHCP: A Unified Model for Human-Centric Perceptions

Ci, Yuanzheng; Wang, Yizhou; Chen, Meilin; Tang, Shixiang; Bai, Lei; Zhu, Feng; Zhao, Rui; Yu, Fengwei; Qi, Donglian; Ouyang, Wanli
Abstract

Human-centric perceptions (e.g., pose estimation, human parsing, pedestrian detection, person re-identification, etc.) play a key role in industrial applications of visual models. While specific human-centric tasks have their own relevant semantic aspects to focus on, they also share the same underlying semantic structure of the human body. However, few works have attempted to exploit such homogeneity and design a general-purpose model for human-centric tasks. In this work, we revisit a broad range of human-centric tasks and unify them in a minimalist manner. We propose UniHCP, a Unified Model for Human-Centric Perceptions, which unifies a wide range of human-centric tasks in a simplified end-to-end manner with the plain vision transformer architecture. With large-scale joint training on 33 human-centric datasets, UniHCP can outperform strong baselines on several in-domain and downstream tasks by direct evaluation. When adapted to a specific task, UniHCP achieves new SOTAs on a wide range of human-centric tasks, e.g., 69.8 mIoU on CIHP for human parsing, 86.18 mA on PA-100K for attribute prediction, 90.3 mAP on Market1501 for ReID, and 85.8 JI on CrowdHuman for pedestrian detection, performing better than specialized models tailored for each task.
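
To make the idea of a single shared backbone serving many human-centric tasks concrete, the sketch below shows a minimal PyTorch illustration: one plain transformer encoder over image patches, with lightweight task-specific outputs for parsing, attribute prediction, and ReID. This is an assumption-laden illustration of the weight-sharing concept only, not the authors' UniHCP implementation; the class names, head designs, and dimensions are placeholders, and UniHCP's actual task-unification mechanism is not reproduced here.

```python
import torch
import torch.nn as nn

# Conceptual sketch (not the authors' code): a single plain-ViT-style encoder
# shared across human-centric tasks, with placeholder task-specific heads.

class SharedViTEncoder(nn.Module):
    """Minimal plain transformer encoder over image patches."""
    def __init__(self, img_size=224, patch=16, dim=256, depth=4, heads=8):
        super().__init__()
        self.patch_embed = nn.Conv2d(3, dim, kernel_size=patch, stride=patch)
        num_patches = (img_size // patch) ** 2
        self.pos_embed = nn.Parameter(torch.zeros(1, num_patches, dim))
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=heads,
                                           batch_first=True)
        self.blocks = nn.TransformerEncoder(layer, num_layers=depth)

    def forward(self, x):                                        # x: (B, 3, H, W)
        tokens = self.patch_embed(x).flatten(2).transpose(1, 2)  # (B, N, dim)
        return self.blocks(tokens + self.pos_embed)

class UnifiedHumanModel(nn.Module):
    """One shared encoder, several lightweight task heads (placeholders)."""
    def __init__(self, dim=256, parsing_classes=20, num_attributes=26,
                 reid_dim=128):
        super().__init__()
        self.encoder = SharedViTEncoder(dim=dim)
        self.parsing_head = nn.Linear(dim, parsing_classes)   # per-token labels
        self.attribute_head = nn.Linear(dim, num_attributes)  # multi-label logits
        self.reid_head = nn.Linear(dim, reid_dim)              # identity embedding

    def forward(self, x, task):
        feats = self.encoder(x)                  # (B, N, dim), shared by all tasks
        if task == "parsing":
            return self.parsing_head(feats)      # (B, N, parsing_classes)
        pooled = feats.mean(dim=1)               # (B, dim)
        if task == "attribute":
            return self.attribute_head(pooled)
        if task == "reid":
            return nn.functional.normalize(self.reid_head(pooled), dim=-1)
        raise ValueError(f"unknown task: {task}")

model = UnifiedHumanModel()
imgs = torch.randn(2, 3, 224, 224)
print(model(imgs, "parsing").shape)    # torch.Size([2, 196, 20])
print(model(imgs, "attribute").shape)  # torch.Size([2, 26])
print(model(imgs, "reid").shape)       # torch.Size([2, 128])
```

In a joint-training setup of the kind the abstract describes, batches from different datasets would be routed through the same encoder with their respective heads, so the shared parameters see supervision from all tasks while only the small heads remain task-specific.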