Sentiment Analysis on SST-2 Binary
Metrics
Accuracy

Results
Performance results of various models on this benchmark.
| Model | Accuracy | Paper Title | Repository |
|---|---|---|---|
| CNN | 91.2 | On the Role of Text Preprocessing in Neural Network Architectures: An Evaluation Study on Text Categorization and Sentiment Analysis | |
| SpanBERT | 94.8 | SpanBERT: Improving Pre-training by Representing and Predicting Spans | |
| MV-RNN | 82.9 | - | - |
| gMLP-large | 94.8 | Pay Attention to MLPs | |
| C-LSTM | 87.8 | A C-LSTM Neural Network for Text Classification | |
| ASA + BERT-base | 94.1 | Adversarial Self-Attention for Language Understanding | |
| Emo2Vec | 81.2 | Emo2Vec: Learning Generalized Emotion Representation by Multi-task Training | |
| ASA + RoBERTa | 96.3 | Adversarial Self-Attention for Language Understanding | |
| Single-layer BiLSTM distilled from BERT | 90.7 | Distilling Task-Specific Knowledge from BERT into Simple Neural Networks | |
| SMART+BERT-BASE | 93 | SMART: Robust and Efficient Fine-Tuning for Pre-trained Natural Language Models through Principled Regularized Optimization | |
| USE_T+CNN (lrn w.e.) | 87.21 | Universal Sentence Encoder | |
| MPAD-path | 87.75 | Message Passing Attention Networks for Document Understanding | - |
| CNN-multichannel [kim2013] | 88.1 | Convolutional Neural Networks for Sentence Classification | |
| T5-11B | 97.5 | Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer | |
| 24hBERT | 93.0 | How to Train BERT with an Academic Budget | |
| RealFormer | 94.04 | RealFormer: Transformer Likes Residual Attention | |
| T5-3B | 97.4 | Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer | |
| T5-Large 770M | 96.3 | Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer | |
| 2-layer LSTM [tai2015improved] | 86.3 | Improved Semantic Representations From Tree-Structured Long Short-Term Memory Networks | |
| BLSTM-2DCNN | 89.5 | Text Classification Improved by Integrating Bidirectional LSTM with Two-dimensional Max Pooling | |