
Handwritten Mathematical Expression Recognition with Bidirectionally Trained Transformer

Wenqi Zhao, Liangcai Gao, Zuoyu Yan, Shuai Peng, Lin Du, Ziyin Zhang

Abstract

Encoder-decoder models have made great progress on handwritten mathematical expression recognition recently. However, it is still a challenge for existing methods to assign attention to image features accurately. Moreover, those encoder-decoder models usually adopt RNN-based models in their decoder part, which makes them inefficient in processing long LaTeX sequences. In this paper, a transformer-based decoder is employed to replace RNN-based ones, which makes the whole model architecture very concise. Furthermore, a novel training strategy is introduced to fully exploit the potential of the transformer in bidirectional language modeling. Compared to several methods that do not use data augmentation, experiments demonstrate that our model improves the ExpRate of current state-of-the-art methods on CROHME 2014 by 2.23%. Similarly, on CROHME 2016 and CROHME 2019, we improve the ExpRate by 1.92% and 2.28% respectively.
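The abstract's bidirectional language modeling strategy implies training the decoder on both reading directions of each target sequence. A minimal sketch of how such paired targets might be prepared from a tokenized LaTeX label is shown below; the function name, special tokens, and exact target layout are illustrative assumptions, not details taken from the paper.

```python
def make_bidirectional_targets(tokens, sos="<sos>", eos="<eos>"):
    """Build left-to-right and right-to-left training targets
    from one tokenized LaTeX sequence (hypothetical helper)."""
    # Left-to-right target: standard autoregressive order.
    l2r = [sos] + tokens + [eos]
    # Right-to-left target: the same tokens reversed, with the
    # start/end markers swapped so decoding begins from <eos>.
    r2l = [eos] + tokens[::-1] + [sos]
    return l2r, r2l

l2r, r2l = make_bidirectional_targets(["x", "^", "{", "2", "}"])
# l2r: ['<sos>', 'x', '^', '{', '2', '}', '<eos>']
# r2l: ['<eos>', '}', '2', '{', '^', 'x', '<sos>']
```

Under this scheme, a single transformer decoder could be trained on both targets jointly, letting one set of parameters model the sequence in both directions.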

