Handwritten Mathematical Expression Recognition via Attention Aggregation based Bi-directional Mutual Learning

Handwritten mathematical expression recognition aims to automatically generate LaTeX sequences from given images. Currently, attention-based encoder-decoder models are widely used in this task. They typically generate target sequences in a left-to-right (L2R) manner, leaving the right-to-left (R2L) contexts unexploited. In this paper, we propose an Attention aggregation based Bi-directional Mutual learning Network (ABM), which consists of one shared encoder and two parallel inverse decoders (L2R and R2L). The two decoders are enhanced via mutual distillation, which involves one-to-one knowledge transfer at each training step, making full use of the complementary information from the two inverse directions. Moreover, in order to deal with mathematical symbols of diverse scales, an Attention Aggregation Module (AAM) is proposed to effectively integrate multi-scale coverage attentions. Notably, in the inference phase, given that the model has already learned knowledge from two inverse directions, we only use the L2R branch for inference, keeping the original parameter size and inference speed. Extensive experiments demonstrate that our proposed approach achieves recognition accuracies of 56.85% on CROHME 2014, 52.92% on CROHME 2016, and 53.96% on CROHME 2019 without data augmentation or model ensembling, substantially outperforming state-of-the-art methods. The source code is available at https://github.com/XH-B/ABM.
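
To make the bi-directional mutual learning objective concrete, the following is a minimal PyTorch sketch of a step-wise mutual distillation loss between the two decoders. It is an illustration under assumptions, not the authors' released code: the function names (`kd_loss`, `abm_training_loss`), the loss weight `lambda_kd`, and the temperature `T` are hypothetical, since the abstract does not specify the exact loss formulation. The one-to-one knowledge transfer is realized here by pairing step t of the L2R decoder with step L-1-t of the R2L decoder via a flip along the time axis.

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, T=1.0):
    """KL divergence on temperature-softened distributions; the teacher
    is detached so gradients only flow into the student branch."""
    p_teacher = F.softmax(teacher_logits.detach() / T, dim=-1)
    log_p_student = F.log_softmax(student_logits / T, dim=-1)
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * (T * T)

def abm_training_loss(logits_l2r, logits_r2l, tgt_l2r, tgt_r2l,
                      lambda_kd=0.5, T=1.0):
    """Cross-entropy on both decoders plus symmetric mutual distillation.

    logits_l2r, logits_r2l: (B, L, V) per-step vocabulary logits.
    tgt_l2r, tgt_r2l: (B, L) target token ids; tgt_r2l is tgt_l2r reversed.
    """
    B, L, V = logits_l2r.shape
    ce_l2r = F.cross_entropy(logits_l2r.reshape(-1, V), tgt_l2r.reshape(-1))
    ce_r2l = F.cross_entropy(logits_r2l.reshape(-1, V), tgt_r2l.reshape(-1))
    # Align R2L predictions with L2R time steps by reversing the time axis.
    logits_r2l_aligned = torch.flip(logits_r2l, dims=[1])
    # Each branch distills from the other (mutual, one-to-one per step).
    kd = (kd_loss(logits_l2r, logits_r2l_aligned, T)
          + kd_loss(logits_r2l_aligned, logits_l2r, T))
    return ce_l2r + ce_r2l + lambda_kd * kd
```

At inference time only the L2R decoder is kept, so this extra supervision adds no parameters or latency to the deployed model.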
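
The multi-scale coverage attention can likewise be sketched as below. This is a minimal illustration of the idea named in the abstract, not the paper's exact AAM: the kernel sizes (5 and 11), channel counts, and the additive-attention parameterization are assumptions chosen to show how coverage maps at two receptive-field scales can be aggregated into one attention energy.

```python
import torch
import torch.nn as nn

class AttentionAggregation(nn.Module):
    """Sketch of multi-scale coverage attention: the cumulative attention
    map is convolved at a small and a large receptive field, and both are
    folded into an additive-attention energy over encoder features."""
    def __init__(self, enc_dim, dec_dim, attn_dim, cov_ch=32):
        super().__init__()
        self.conv_small = nn.Conv2d(1, cov_ch, kernel_size=5, padding=2)
        self.conv_large = nn.Conv2d(1, cov_ch, kernel_size=11, padding=5)
        self.proj_enc = nn.Linear(enc_dim, attn_dim)
        self.proj_dec = nn.Linear(dec_dim, attn_dim)
        self.proj_cov = nn.Linear(2 * cov_ch, attn_dim)
        self.score = nn.Linear(attn_dim, 1)

    def forward(self, enc_feat, dec_state, cum_attn):
        # enc_feat: (B, H, W, enc_dim); dec_state: (B, dec_dim);
        # cum_attn: (B, H, W) sum of all past attention maps (coverage).
        B, H, W, _ = enc_feat.shape
        cov = torch.cat([self.conv_small(cum_attn.unsqueeze(1)),
                         self.conv_large(cum_attn.unsqueeze(1))], dim=1)
        cov = cov.permute(0, 2, 3, 1)          # (B, H, W, 2*cov_ch)
        energy = self.score(torch.tanh(
            self.proj_enc(enc_feat)
            + self.proj_dec(dec_state)[:, None, None, :]
            + self.proj_cov(cov))).squeeze(-1)  # (B, H, W)
        alpha = torch.softmax(energy.reshape(B, -1), dim=-1).reshape(B, H, W)
        context = (alpha.unsqueeze(-1) * enc_feat).reshape(B, H * W, -1).sum(1)
        return context, alpha, cum_attn + alpha  # updated coverage
```

The two kernel sizes give the attention both a fine view for small symbols (e.g., subscripts, dots) and a coarse view for large structures (e.g., fraction bars, radicals), which is the motivation the abstract gives for handling symbols of diverse scales.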