Language Modelling On Wiki 40B
Evaluation metric
Perplexity
Evaluation results
Performance of each model on this benchmark
| Model | Perplexity | Paper Title |
|---|---|---|
| Combiner-Fixed-8k | 16.60 | Combiner: Full Attention Transformer with Sparse Computation Cost |
| Combiner-Axial-8k | 16.49 | Combiner: Full Attention Transformer with Sparse Computation Cost |
| FLASH-Quad-8k | 14.998 | Transformer Quality in Linear Time |
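For reference, perplexity is the exponential of the average per-token negative log-likelihood of the test set, so lower is better. A minimal sketch of the computation (the per-token NLL values below are hypothetical, not taken from any of these models):

```python
import math

def perplexity(token_nlls):
    """Perplexity = exp(mean negative log-likelihood per token, natural log)."""
    return math.exp(sum(token_nlls) / len(token_nlls))

# Hypothetical per-token NLLs (in nats) for a short sequence.
nlls = [2.9, 2.6, 2.8, 2.95]
print(round(perplexity(nlls), 2))  # → 16.65, in the range of the table above
```

A mean NLL of about 2.81 nats per token thus corresponds to a perplexity near 16.65, which is why small differences in average loss translate into the sub-point gaps seen on this leaderboard.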