
Hierarchical Long Short-Term Concurrent Memory for Human Interaction Recognition

Xiangbo Shu, Jinhui Tang, Guo-Jun Qi, Wei Liu, Jian Yang
Abstract

In this paper, we aim to address the problem of human interaction recognition in videos by exploring the long-term inter-related dynamics among multiple persons. Recently, Long Short-Term Memory (LSTM) has become a popular choice for modeling individual dynamics in single-person action recognition, owing to its ability to capture temporal motion information over a range of frames. However, existing RNN models capture the dynamics of human interaction only by simply combining the dynamics of all individuals or by modeling the group as a whole, and thus neglect the inter-related dynamics of how human interactions change over time. To this end, we propose a novel Hierarchical Long Short-Term Concurrent Memory (H-LSTCM) to model the long-term inter-related dynamics among a group of persons for recognizing human interactions. Specifically, we first feed each person's static features into a Single-Person LSTM to learn the single-person dynamics. Subsequently, the outputs of all Single-Person LSTM units are fed into a novel Concurrent LSTM (Co-LSTM) unit, which mainly consists of multiple sub-memory units, a new cell gate, and a new co-memory cell. In a Co-LSTM unit, each sub-memory unit stores individual motion information, while the unit selectively integrates and stores inter-related motion information between multiple interacting persons from the sub-memory units via the cell gate and co-memory cell, respectively. Extensive experiments on four public datasets validate the effectiveness of the proposed H-LSTCM by comparing it against baseline and state-of-the-art methods.
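
The hierarchical structure described in the abstract (per-person LSTMs feeding a Co-LSTM with sub-memory units, a cell gate, and a shared co-memory cell) can be sketched as a compact recurrent model. The PyTorch code below is only an illustrative reading of that description: the class names, gate formulations, dimensions, and final classifier are assumptions for the sketch and do not reproduce the authors' exact equations or training setup.

```python
# A minimal, illustrative PyTorch sketch of the H-LSTCM idea; all names and
# dimensions are assumptions, not the authors' implementation.
import torch
import torch.nn as nn


class CoLSTMCell(nn.Module):
    """Concurrent LSTM (Co-LSTM) cell sketch: one sub-memory per person plus a
    shared co-memory cell that aggregates inter-related motion information."""

    def __init__(self, input_dim, hidden_dim, num_persons):
        super().__init__()
        self.num_persons = num_persons
        self.hidden_dim = hidden_dim
        # Per-person sub-memory gates (input, forget, candidate), shared weights.
        self.gates = nn.Linear(input_dim + hidden_dim, 3 * hidden_dim)
        # Cell gate controlling how each sub-memory flows into the co-memory.
        self.cell_gate = nn.Linear(input_dim + hidden_dim, hidden_dim)
        # Output gate applied to the shared co-memory cell.
        self.out_gate = nn.Linear(num_persons * input_dim + hidden_dim, hidden_dim)

    def forward(self, person_feats, sub_cells, co_cell, co_hidden):
        # person_feats: list of (batch, input_dim) tensors, one per person.
        new_sub_cells, gated = [], []
        for p, x in enumerate(person_feats):
            z = torch.cat([x, co_hidden], dim=-1)
            i, f, g = torch.chunk(self.gates(z), 3, dim=-1)
            # Update this person's sub-memory with standard LSTM-style gating.
            c = torch.sigmoid(f) * sub_cells[p] + torch.sigmoid(i) * torch.tanh(g)
            new_sub_cells.append(c)
            # Cell gate selects what each sub-memory contributes to the co-memory.
            gated.append(torch.sigmoid(self.cell_gate(z)) * torch.tanh(c))
        new_co_cell = co_cell + torch.stack(gated, dim=0).sum(dim=0)
        o = torch.sigmoid(self.out_gate(torch.cat(person_feats + [co_hidden], dim=-1)))
        new_co_hidden = o * torch.tanh(new_co_cell)
        return new_sub_cells, new_co_cell, new_co_hidden


class HLSTCM(nn.Module):
    """Hierarchy: Single-Person LSTMs feed a Co-LSTM, whose output is classified."""

    def __init__(self, feat_dim, hidden_dim, num_persons, num_classes):
        super().__init__()
        self.person_lstm = nn.LSTM(feat_dim, hidden_dim, batch_first=True)
        self.co_lstm = CoLSTMCell(hidden_dim, hidden_dim, num_persons)
        self.classifier = nn.Linear(hidden_dim, num_classes)

    def forward(self, x):
        # x: (batch, num_persons, time, feat_dim) static per-person features.
        batch, num_persons, time, _ = x.shape
        # Single-Person LSTMs learn each person's individual dynamics.
        person_out = [self.person_lstm(x[:, p])[0] for p in range(num_persons)]
        hid = self.co_lstm.hidden_dim
        sub_cells = [x.new_zeros(batch, hid) for _ in range(num_persons)]
        co_cell = x.new_zeros(batch, hid)
        co_hidden = x.new_zeros(batch, hid)
        for t in range(time):
            feats_t = [person_out[p][:, t] for p in range(num_persons)]
            sub_cells, co_cell, co_hidden = self.co_lstm(feats_t, sub_cells, co_cell, co_hidden)
        # Classify the interaction from the final co-memory hidden state.
        return self.classifier(co_hidden)


if __name__ == "__main__":
    model = HLSTCM(feat_dim=512, hidden_dim=256, num_persons=2, num_classes=8)
    clip = torch.randn(4, 2, 16, 512)  # 4 clips, 2 persons, 16 frames each
    print(model(clip).shape)  # torch.Size([4, 8])
```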