
Learning Multi-dimensional Edge Feature-based AU Relation Graph for Facial Action Unit Recognition

Cheng Luo, Siyang Song, Weicheng Xie, Linlin Shen, Hatice Gunes
Abstract

The activations of Facial Action Units (AUs) mutually influence one another. While the relationship between a pair of AUs can be complex and unique, existing approaches fail to specifically and explicitly represent such cues for each pair of AUs in each facial display. This paper proposes an AU relationship modelling approach that deep learns a unique graph to explicitly describe the relationship between each pair of AUs of the target facial display. Our approach first encodes each AU's activation status and its association with other AUs into a node feature. Then, it learns a pair of multi-dimensional edge features to describe multiple task-specific relationship cues between each pair of AUs. During both node and edge feature learning, our approach also considers the influence of the unique facial display on AUs' relationship by taking the full face representation as an input. Experimental results on BP4D and DISFA datasets show that both node and edge feature learning modules provide large performance improvements for CNN and transformer-based backbones, with our best systems achieving the state-of-the-art AU recognition results. Our approach not only has a strong capability in modelling relationship cues for AU recognition but also can be easily incorporated into various backbones. Our PyTorch code is made available.
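To make the abstract's description more concrete, below is a minimal PyTorch sketch of the general idea, not the authors' implementation: per-AU node features are derived from the full-face representation, and a multi-dimensional edge feature is computed for every pair of AUs conditioned on both node features and the face representation. All class, module, and parameter names (e.g. AURelationGraphSketch, face_dim, edge_dim) are hypothetical assumptions for illustration only.

```python
# Hedged sketch of node + multi-dimensional edge feature learning for AU recognition.
# Names and dimensions are assumptions, not taken from the paper's released code.
import torch
import torch.nn as nn


class AURelationGraphSketch(nn.Module):
    def __init__(self, face_dim=512, num_aus=12, node_dim=128, edge_dim=16):
        super().__init__()
        self.num_aus = num_aus
        # One small head per AU maps the full-face representation to a node feature
        # intended to encode that AU's activation status and its association with other AUs.
        self.node_heads = nn.ModuleList(
            [nn.Linear(face_dim, node_dim) for _ in range(num_aus)]
        )
        # Edge MLP maps a pair of node features plus the face representation to a
        # multi-dimensional edge feature describing their relationship cues.
        self.edge_mlp = nn.Sequential(
            nn.Linear(2 * node_dim + face_dim, edge_dim),
            nn.ReLU(),
            nn.Linear(edge_dim, edge_dim),
        )
        # Per-AU occurrence classifier operating on node features.
        self.classifier = nn.Linear(node_dim, 1)

    def forward(self, face_feat):                        # face_feat: (B, face_dim)
        nodes = torch.stack(
            [head(face_feat) for head in self.node_heads], dim=1
        )                                                 # (B, num_aus, node_dim)
        B, N, D = nodes.shape
        src = nodes.unsqueeze(2).expand(B, N, N, D)       # node i repeated over j
        dst = nodes.unsqueeze(1).expand(B, N, N, D)       # node j repeated over i
        face = face_feat[:, None, None, :].expand(B, N, N, -1)
        edges = self.edge_mlp(torch.cat([src, dst, face], dim=-1))  # (B, N, N, edge_dim)
        logits = self.classifier(nodes).squeeze(-1)       # (B, num_aus) AU occurrence logits
        return logits, edges


# Example usage: twelve AUs (as annotated in BP4D) from a 512-d backbone feature.
model = AURelationGraphSketch()
logits, edge_feats = model(torch.randn(4, 512))
print(logits.shape, edge_feats.shape)  # torch.Size([4, 12]) torch.Size([4, 12, 12, 16])
```

Because the edge features are produced per facial display (the face representation is an input to the edge MLP), each image yields its own relation graph rather than a single fixed AU co-occurrence structure.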
