
Beyond Attentive Tokens: Incorporating Token Importance and Diversity for Efficient Vision Transformers

Sifan Long, Zhen Zhao, Jimin Pi, Shengsheng Wang, Jingdong Wang
Abstract

Vision transformers have achieved significant improvements on various vision tasks, but their quadratic interactions between tokens significantly reduce computational efficiency. Many pruning methods have recently been proposed to remove redundant tokens for efficient vision transformers. However, existing studies mainly focus on token importance to preserve locally attentive tokens while completely ignoring global token diversity. In this paper, we emphasize the importance of diverse global semantics and propose an efficient token decoupling and merging method that jointly considers token importance and diversity for token pruning. According to the class token attention, we decouple the attentive and inattentive tokens. In addition to preserving the most discriminative local tokens, we merge similar inattentive tokens and match homogeneous attentive tokens to maximize token diversity. Despite its simplicity, our method obtains a promising trade-off between model complexity and classification accuracy. On DeiT-S, our method reduces the FLOPs by 35% with only a 0.2% accuracy drop. Notably, benefiting from maintaining token diversity, our method can even improve the accuracy of DeiT-T by 0.1% after reducing its FLOPs by 40%.
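The core idea described above can be sketched in code. This is a minimal, hedged illustration (not the authors' implementation): it splits tokens into attentive and inattentive sets by class-token attention, keeps the attentive tokens, and greedily merges similar inattentive tokens by cosine similarity so their semantics survive as cluster averages. The function name, the greedy clustering scheme, and the `keep_ratio`/`sim_thresh` parameters are illustrative assumptions; the matching of homogeneous attentive tokens mentioned in the abstract is omitted for brevity.

```python
import numpy as np

def decouple_and_merge(tokens, cls_attn, keep_ratio=0.6, sim_thresh=0.9):
    """Sketch of attention-based token decoupling and merging.

    tokens:   (N, D) patch token embeddings
    cls_attn: (N,)   class-token attention over the patch tokens
    Returns a reduced token set: attentive tokens plus merged
    inattentive-token clusters.
    """
    N, D = tokens.shape
    k = max(1, int(N * keep_ratio))

    # Decouple: top-k tokens by class attention are "attentive".
    order = np.argsort(-cls_attn)
    attn_idx, inattn_idx = order[:k], order[k:]
    attentive = tokens[attn_idx]          # preserved discriminative tokens
    inattentive = tokens[inattn_idx]

    if len(inattn_idx) == 0:
        return attentive

    # Merge: greedily assign each inattentive token to its most similar
    # cluster (cosine similarity); otherwise start a new cluster. Each
    # cluster is represented by the running mean of its members.
    norm = lambda x: x / (np.linalg.norm(x) + 1e-8)
    clusters, sums, counts = [], [], []
    for t in inattentive:
        tn = norm(t)
        if clusters:
            sims = np.array([tn @ norm(c) for c in clusters])
            j = int(np.argmax(sims))
            if sims[j] > sim_thresh:
                sums[j] += t
                counts[j] += 1
                clusters[j] = sums[j] / counts[j]
                continue
        clusters.append(t.copy())
        sums.append(t.copy())
        counts.append(1)
    merged = np.stack(clusters)

    return np.concatenate([attentive, merged], axis=0)
```

Because similar inattentive tokens collapse into single averaged tokens rather than being discarded outright, their global semantics remain available to later layers, which is the diversity-preserving behavior the abstract credits for the accuracy gain at high pruning ratios.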
