Coreference Resolution on GAP
Metrics
Bias (F/M)
Feminine F1 (F)
Masculine F1 (M)
Overall F1
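
These scores can be computed from per-gender prediction counts. Below is a minimal Python sketch, assuming the standard F1 definition, a micro-averaged overall F1 over the pooled counts, and Bias (F/M) taken as the ratio of feminine F1 to masculine F1; the `Counts` helper and the example numbers are hypothetical and are not drawn from the table below.

```python
from dataclasses import dataclass

@dataclass
class Counts:
    tp: int  # true positives
    fp: int  # false positives
    fn: int  # false negatives

def f1(c: Counts) -> float:
    """Standard F1: harmonic mean of precision and recall."""
    precision = c.tp / (c.tp + c.fp) if (c.tp + c.fp) else 0.0
    recall = c.tp / (c.tp + c.fn) if (c.tp + c.fn) else 0.0
    return 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0

def gap_scores(fem: Counts, masc: Counts) -> dict:
    """F1 on the feminine and masculine subsets, micro-averaged overall F1,
    and the bias ratio (feminine F1 / masculine F1); 1.0 means no gender gap."""
    f_f1 = f1(fem)
    m_f1 = f1(masc)
    overall = f1(Counts(fem.tp + masc.tp, fem.fp + masc.fp, fem.fn + masc.fn))
    return {
        "Feminine F1": f_f1,
        "Masculine F1": m_f1,
        "Overall F1": overall,
        "Bias (F/M)": f_f1 / m_f1 if m_f1 else float("nan"),
    }

# Illustrative counts only, not taken from any system reported on this benchmark.
print(gap_scores(fem=Counts(tp=430, fp=40, fn=35), masc=Counts(tp=445, fp=30, fn=28)))
```

A Bias (F/M) value below 1.0 indicates the system resolves masculine pronouns more accurately than feminine ones.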
Results
Performance results of various models on this benchmark
| Model name | Bias (F/M) | Feminine F1 (F) | Masculine F1 (M) | Overall F1 | Paper Title | Repository |
|---|---|---|---|---|---|---|
| Coref-MTL | 0.99 | 92.45 | 92.65 | 92.72 | - | - |
| PeTra | - | - | - | - | PeTra: A Sparsely Supervised Memory Model for People Tracking | |
| ProBERT | 0.97 | 91.1 | 94.0 | 92.5 | Gendered Ambiguous Pronouns Shared Task: Boosting Model Confidence by Evidence Pooling | |
| Maverick_incr | - | - | - | 91.2 | Maverick: Efficient and Accurate Coreference Resolution Defying Recent Trends | |
| Full Ensemble | 0.98 | 89.5 | 90.9 | 90.2 | Gendered Pronoun Resolution using BERT and an extractive question answering formulation | |