RoBERTa WWM Ext (News+Factors) | 62.49 | 62.54 | 62.59 | 62.51 | RoBERTa: A Robustly Optimized BERT Pretraining Approach | |
Chinese PERT Large (News+Factors) | 67.37 | 67.27 | 67.28 | 67.73 | PERT: Pre-training BERT with Permuted Language Model | |
RoBERTa WWM Ext (News) | 61.34 | 61.48 | 61.97 | 61.32 | RoBERTa: A Robustly Optimized BERT Pretraining Approach | |
Chinese PERT Large (News) | 65.09 | 65.03 | 65.02 | 65.07 | PERT: Pre-training BERT with Permuted Language Model | |
Chinese LERT Large (News) | 64.37 | 64.30 | 64.34 | 64.31 | LERT: A Linguistically-motivated Pre-trained Language Model | |
Chinese LERT Large (News+Factors) | 66.36 | 66.16 | 66.40 | 66.69 | LERT: A Linguistically-motivated Pre-trained Language Model | |
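For reference, below is a minimal sketch of loading one of the encoders listed above for sequence classification with Hugging Face `transformers`. The Hub checkpoint id (`hfl/chinese-pert-large`), the `num_labels` value, and the example headline are assumptions and not taken from this table; the classification head is randomly initialized and must be fine-tuned on the benchmark data before the scores above can be reproduced.

```python
# Sketch only: assumes the HFL checkpoints on the Hugging Face Hub and a
# 3-class label head; the benchmark's actual label set, metrics, and
# training recipe are not specified in this table.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

MODEL_NAME = "hfl/chinese-pert-large"  # assumed Hub id for "Chinese PERT Large"

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
# The classifier head is newly initialized here; fine-tune before evaluating.
model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME, num_labels=3)
model.eval()

# Encode a single news headline (the "News" input setting in the table).
text = "央行宣布下调存款准备金率"  # "The central bank announces a reserve-ratio cut"
inputs = tokenizer(text, truncation=True, max_length=128, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits      # shape: (1, num_labels)
    pred = logits.argmax(dim=-1).item()  # predicted class index

print(f"predicted label id: {pred}")
```

Swapping `MODEL_NAME` for `hfl/chinese-lert-large` or `hfl/chinese-roberta-wwm-ext` follows the same pattern; the "(News+Factors)" rows additionally concatenate factor features with the news text, which is not shown in this sketch.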