Question Answer Categorization on QC-Science
Evaluation Metrics
R@5
R@10
R@15
R@20
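For reference, below is a minimal sketch of how Recall@k (R@k) is typically computed for tag recommendation. The single-gold-label setup, the `recall_at_k` helper, and the example labels are illustrative assumptions, not details taken from this benchmark.

```python
# Minimal sketch of Recall@k for taxonomy tag recommendation.
# Assumptions (not taken from the leaderboard itself): each question has one
# gold taxonomy label, the model returns labels ranked by score, and R@k is
# the fraction of questions whose gold label appears in the top-k predictions.
from typing import Sequence


def recall_at_k(gold: Sequence[str], ranked: Sequence[Sequence[str]], k: int) -> float:
    """Fraction of questions whose gold label is among the top-k predictions."""
    hits = sum(1 for g, preds in zip(gold, ranked) if g in preds[:k])
    return hits / len(gold)


# Example: 2 of 3 questions have the gold label inside the top-2 ranking.
gold_labels = ["physics>>motion", "chemistry>>acids", "biology>>cells"]
ranked_preds = [
    ["physics>>motion", "physics>>force"],
    ["chemistry>>bonds", "chemistry>>acids"],
    ["biology>>plants", "physics>>energy"],
]
print(recall_at_k(gold_labels, ranked_preds, k=2))  # -> 0.666...
```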
Evaluation Results
Performance results for each model on this benchmark.
Model | R@5 | R@10 | R@15 | R@20 | Paper Title | Repository |
---|---|---|---|---|---|---|
TagRec(BERT+USE) | 0.86 | 0.92 | 0.95 | 0.96 | TagRec: Automated Tagging of Questions with Hierarchical Learning Taxonomy | |
BERT+GloVe | 0.76 | 0.87 | 0.92 | 0.94 | TagRec: Automated Tagging of Questions with Hierarchical Learning Taxonomy | |
Pretrained Sent BERT | 0.30 | 0.40 | 0.47 | 0.52 | TagRec: Automated Tagging of Questions with Hierarchical Learning Taxonomy | |
TagRec(BERT+Sent BERT) | 0.85 | 0.93 | 0.95 | 0.97 | TagRec: Automated Tagging of Questions with Hierarchical Learning Taxonomy | |
Twin BERT | 0.72 | 0.86 | 0.91 | 0.94 | TagRec: Automated Tagging of Questions with Hierarchical Learning Taxonomy | |
BERT+sent2vec | 0.79 | 0.89 | 0.93 | 0.95 | TagRec: Automated Tagging of Questions with Hierarchical Learning Taxonomy | |