Chinese Named Entity Recognition on MSRA
Metric: F1 (entity-level; a minimal computation sketch follows the table below)

Results: performance of the various models on this benchmark.
| Model Name | F1 | Paper Title |
| --- | --- | --- |
| BERT-MRC+DSC | 96.72 | Dice Loss for Data-imbalanced NLP Tasks |
| BERT-CRF (Replicated in AdaSeq) | 96.69 | Improving Named Entity Recognition by External Context Retrieving and Cooperative Learning |
| Baseline + BS | 96.26 | Boundary Smoothing for Named Entity Recognition |
| W2NER | 96.10 | Unified Named Entity Recognition as Word-Word Relation Classification |
| FLAT+BERT | 96.09 | FLAT: Chinese NER Using Flat-Lattice Transformer |
| BERT-MRC | 95.75 | A Unified MRC Framework for Named Entity Recognition |
| FGN | 95.64 | FGN: Fusion Glyph Network for Chinese Named Entity Recognition |
| Glyce + BERT | 95.54 | Glyce: Glyph-vectors for Chinese Character Representations |
| ZEN (Init with Chinese BERT) | 95.25 | ZEN: Pre-training Chinese Text Encoder Enhanced by N-gram Representations |
| ERNIE 2.0 Large | 95 | ERNIE 2.0: A Continual Pre-training Framework for Language Understanding |
| DiffusionNER | 94.91 | DiffusionNER: Boundary Diffusion for Named Entity Recognition |
| NFLAT | 94.55 | NFLAT: Non-Flat-Lattice Transformer for Chinese Named Entity Recognition |
| FLAT | 94.12 | FLAT: Chinese NER Using Flat-Lattice Transformer |
| ERNIE 2.0 Base | 93.8 | ERNIE 2.0: A Continual Pre-training Framework for Language Understanding |
| LSTM + Lexicon augment | 93.5 | Simplify the Usage of Lexicon in Chinese NER |
| PIQN | 93.48 | Parallel Instance Query Network for Named Entity Recognition |
| ZEN (Random Init) | 93.24 | ZEN: Pre-training Chinese Text Encoder Enhanced by N-gram Representations |
| Lattice | 93.18 | Chinese NER Using Lattice LSTM |
| CAN-NER Model | 92.97 | CAN-NER: Convolutional Attention Network for Chinese Named Entity Recognition |
| TENER | 92.74 | TENER: Adapting Transformer Encoder for Named Entity Recognition |
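The F1 scores above are the standard entity-level micro-F1 for NER: a predicted entity counts as correct only when both its span and its type exactly match a gold entity. Below is a minimal Python sketch of this computation, assuming BIO-tagged sequences; the function names (`extract_entities`, `entity_f1`) are illustrative, published results typically rely on a library such as seqeval, and individual papers may handle malformed tag sequences differently.

```python
def extract_entities(tags):
    """Collect (start, end, type) spans from one BIO tag sequence.

    `end` is exclusive. An I- tag with no matching B-/I- before it is
    ignored here; this is one common convention, not the only one.
    """
    entities, start, etype = [], None, None
    for i, tag in enumerate(tags + ["O"]):  # sentinel "O" flushes the last span
        boundary = (
            tag.startswith("B-")
            or tag == "O"
            or (tag.startswith("I-") and etype != tag[2:])
        )
        if boundary and start is not None:
            entities.append((start, i, etype))
            start, etype = None, None
        if tag.startswith("B-"):
            start, etype = i, tag[2:]
    return entities


def entity_f1(gold_seqs, pred_seqs):
    """Micro-averaged F1 over exact-match entity spans across all sentences."""
    gold = {(s, span) for s, seq in enumerate(gold_seqs)
            for span in extract_entities(seq)}
    pred = {(s, span) for s, seq in enumerate(pred_seqs)
            for span in extract_entities(seq)}
    tp = len(gold & pred)                       # span and type both match
    precision = tp / len(pred) if pred else 0.0
    recall = tp / len(gold) if gold else 0.0
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)


# Example: the PER entity is recovered, the LOC entity is missed,
# so P = 1.0, R = 0.5, F1 ~= 0.667.
gold = [["B-PER", "I-PER", "O", "B-LOC"]]
pred = [["B-PER", "I-PER", "O", "O"]]
print(entity_f1(gold, pred))
```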