
Benchmarking Micro-action Recognition: Dataset, Methods, and Applications

Guo, Dan; Li, Kun; Hu, Bin; Zhang, Yan; Wang, Meng
Abstract

Micro-action is an imperceptible non-verbal behaviour characterised by low-intensity movement. It offers insights into the feelings and intentions of individuals and is important for human-oriented applications such as emotion recognition and psychological assessment. However, the identification, differentiation, and understanding of micro-actions pose challenges due to the imperceptible and inaccessible nature of these subtle human behaviours in everyday life.

In this study, we collect a new micro-action dataset designated Micro-action-52 (MA-52) and propose a benchmark named micro-action network (MANet) for the micro-action recognition (MAR) task. Uniquely, MA-52 provides a whole-body perspective, including gestures and upper- and lower-limb movements, in an attempt to reveal comprehensive micro-action cues. In detail, MA-52 contains 52 micro-action categories along with seven body-part labels, and encompasses a full array of realistic and natural micro-actions, covering 205 participants and 22,422 video instances collected from psychological interviews.

Based on the proposed dataset, we evaluate MANet and nine other prevalent action recognition methods. MANet incorporates squeeze-and-excitation (SE) and temporal shift module (TSM) components into the ResNet architecture to model the spatiotemporal characteristics of micro-actions. A joint-embedding loss is then designed for semantic matching between videos and action labels; this loss helps distinguish visually similar yet distinct micro-action categories. An extended application to emotion recognition demonstrates one of the important values of the proposed dataset and method. In the future, human behaviour, emotion, and psychological assessment will be explored in greater depth. The dataset and source code are released at https://github.com/VUT-HFUT/Micro-Action.
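To make the TSM component concrete, the sketch below shows the core idea of a temporal shift: a fraction of feature channels is exchanged with adjacent frames so that a 2D backbone like ResNet can mix temporal information at zero extra parameter cost. This is a minimal pure-Python illustration of the general TSM operation, not the paper's exact implementation; the function name, the `shift_div` parameter, and the list-of-lists layout are assumptions for readability (real implementations shift tensor slices in place).

```python
def temporal_shift(frames, shift_div=4):
    """Shift a fraction of channels across adjacent frames (TSM idea).

    frames: per-frame channel lists, shape [T][C].
    The first 1/shift_div of channels take values from the next frame,
    the second 1/shift_div from the previous frame, and the rest are
    left unshifted. Out-of-range positions are zero-padded.
    """
    T = len(frames)
    C = len(frames[0])
    fold = C // shift_div
    out = [[0.0] * C for _ in range(T)]
    for t in range(T):
        for c in range(C):
            if c < fold:          # shift backward in time: read the next frame
                src = t + 1
            elif c < 2 * fold:    # shift forward in time: read the previous frame
                src = t - 1
            else:                 # remaining channels stay put
                src = t
            if 0 <= src < T:
                out[t][c] = frames[src][c]
    return out
```

For a clip with 3 frames and 4 channels and `shift_div=4`, one channel per direction is exchanged, so each middle frame sees one channel from its successor and one from its predecessor while the other two channels are untouched.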
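The joint-embedding loss can be understood as a semantic-matching objective: a video embedding should be closer to its own action-label embedding than to the embeddings of other labels. The sketch below is a generic cosine-similarity cross-entropy of this kind, written in plain Python; the function name, `temperature` parameter, and embedding layout are illustrative assumptions, not the paper's exact formulation.

```python
import math

def joint_embedding_loss(video_vec, label_vecs, target_idx, temperature=0.1):
    """Illustrative joint-embedding objective (not the paper's exact loss):
    cross-entropy over cosine similarities between one video embedding
    and every action-label embedding. Pulls the video toward its own
    label and pushes it away from visually similar but distinct labels.
    """
    def cos(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(y * y for y in b))
        return dot / (na * nb)

    # Temperature-scaled similarities to all candidate labels.
    sims = [cos(video_vec, lv) / temperature for lv in label_vecs]
    # Numerically stable softmax cross-entropy on the target label.
    m = max(sims)
    exps = [math.exp(s - m) for s in sims]
    return -math.log(exps[target_idx] / sum(exps))
```

When the video embedding aligns with the correct label the loss is near zero; when it aligns with a wrong label the loss grows, which is the mechanism the abstract describes for separating visually similar micro-action categories.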