TAMER: Tree-Aware Transformer for Handwritten Mathematical Expression Recognition

Handwritten Mathematical Expression Recognition (HMER) has extensive applications in automated grading and office automation. However, existing sequence-based decoding methods, which directly predict $\LaTeX$ sequences, struggle to understand and model the inherent tree structure of $\LaTeX$ and often fail to ensure syntactic correctness in the decoded results. To address these challenges, we propose a novel model named TAMER (Tree-Aware Transformer) for handwritten mathematical expression recognition. TAMER introduces an innovative Tree-aware Module while preserving the flexibility and efficient training of the Transformer. TAMER combines the advantages of sequence decoding and tree decoding models by jointly optimizing the sequence prediction and tree structure prediction tasks, which enhances the model's understanding of, and generalization to, complex mathematical expression structures. During inference, TAMER employs a Tree Structure Prediction Scoring Mechanism to improve the structural validity of the generated $\LaTeX$ sequences. Experimental results on the CROHME datasets demonstrate that TAMER outperforms traditional sequence decoding and tree decoding models, especially in handling complex mathematical structures, achieving state-of-the-art (SOTA) performance.
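
To make the joint-optimization idea concrete, the sketch below shows one plausible way to combine a sequence-prediction loss over $\LaTeX$ tokens with a tree-structure-prediction loss. This is a minimal, hypothetical illustration under assumed conventions (names such as JointLoss, seq_logits, tree_logits, and the weighting factor lambda_tree are ours), not TAMER's actual implementation.

```python
# Hypothetical sketch of jointly optimizing sequence prediction and
# tree structure prediction; all tensor shapes and names are assumptions.
import torch
import torch.nn as nn


class JointLoss(nn.Module):
    def __init__(self, lambda_tree: float = 0.5):
        super().__init__()
        self.lambda_tree = lambda_tree          # assumed weight of the tree-structure term
        self.seq_ce = nn.CrossEntropyLoss(ignore_index=-100)
        self.tree_ce = nn.CrossEntropyLoss(ignore_index=-100)

    def forward(self, seq_logits, seq_targets, tree_logits, tree_targets):
        # seq_logits:   (B, T, V)  scores over LaTeX vocabulary tokens
        # seq_targets:  (B, T)     ground-truth LaTeX token ids
        # tree_logits:  (B, T, T)  score of each earlier position being a token's parent (assumed encoding)
        # tree_targets: (B, T)     index of each token's parent in the sequence
        loss_seq = self.seq_ce(seq_logits.flatten(0, 1), seq_targets.flatten())
        loss_tree = self.tree_ce(tree_logits.flatten(0, 1), tree_targets.flatten())
        # Joint objective: standard sequence cross-entropy plus a weighted
        # auxiliary term supervising the predicted tree structure.
        return loss_seq + self.lambda_tree * loss_tree


if __name__ == "__main__":
    # Toy usage with random tensors, just to show the expected shapes.
    B, T, V = 2, 8, 100
    criterion = JointLoss(lambda_tree=0.5)
    loss = criterion(
        torch.randn(B, T, V), torch.randint(0, V, (B, T)),
        torch.randn(B, T, T), torch.randint(0, T, (B, T)),
    )
    print(loss.item())
```

In such a setup, the tree-structure term could analogously be reused at inference time to rescore candidate $\LaTeX$ hypotheses, which is the role the abstract attributes to the Tree Structure Prediction Scoring Mechanism.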