
Sentiment Analysis on SST-2 Binary

Metrics

Accuracy
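
Accuracy here is the fraction of sentences whose predicted polarity (positive or negative) matches the gold label, reported as a percentage. A minimal sketch of the computation; the `accuracy` helper below is illustrative and not part of any benchmark tooling:

```python
from typing import Sequence

def accuracy(preds: Sequence[int], labels: Sequence[int]) -> float:
    """Fraction of predictions matching the gold labels (SST-2 uses 0 = negative, 1 = positive)."""
    if len(preds) != len(labels):
        raise ValueError("preds and labels must have the same length")
    correct = sum(p == y for p, y in zip(preds, labels))
    return correct / len(labels)

# Toy example: 3 of 4 predictions correct -> 0.75
# (leaderboard scores are this value x 100).
print(accuracy([1, 0, 1, 1], [1, 0, 0, 1]))  # 0.75
```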

Results

Performance results of various models on this benchmark

| Model | Accuracy | Paper |
| --- | --- | --- |
| CNN | 91.2 | On the Role of Text Preprocessing in Neural Network Architectures: An Evaluation Study on Text Categorization and Sentiment Analysis |
| SpanBERT | 94.8 | SpanBERT: Improving Pre-training by Representing and Predicting Spans |
| MV-RNN | 82.9 | -- |
| gMLP-large | 94.8 | Pay Attention to MLPs |
| C-LSTM | 87.8 | A C-LSTM Neural Network for Text Classification |
| ASA + BERT-base | 94.1 | Adversarial Self-Attention for Language Understanding |
| Emo2Vec | 81.2 | Emo2Vec: Learning Generalized Emotion Representation by Multi-task Training |
| ASA + RoBERTa | 96.3 | Adversarial Self-Attention for Language Understanding |
| Single-layer BiLSTM distilled from BERT | 90.7 | Distilling Task-Specific Knowledge from BERT into Simple Neural Networks |
| SMART+BERT-BASE | 93.0 | SMART: Robust and Efficient Fine-Tuning for Pre-trained Natural Language Models through Principled Regularized Optimization |
| USE_T+CNN (lrn w.e.) | 87.21 | Universal Sentence Encoder |
| MPAD-path | 87.75 | Message Passing Attention Networks for Document Understanding |
| CNN-multichannel [kim2013] | 88.1 | Convolutional Neural Networks for Sentence Classification |
| T5-11B | 97.5 | Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer |
| 24hBERT | 93.0 | How to Train BERT with an Academic Budget |
| RealFormer | 94.04 | RealFormer: Transformer Likes Residual Attention |
| T5-3B | 97.4 | Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer |
| T5-Large 770M | 96.3 | Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer |
| 2-layer LSTM [tai2015improved] | 86.3 | Improved Semantic Representations From Tree-Structured Long Short-Term Memory Networks |
| BLSTM-2DCNN | 89.5 | Text Classification Improved by Integrating Bidirectional LSTM with Two-dimensional Max Pooling |
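
For reference, a score in the same format can be reproduced on the public SST-2 validation split roughly as follows. This is a hedged sketch assuming the Hugging Face `datasets` and `transformers` packages; the checkpoint name is an illustrative example, not one of the entries in the table above:

```python
from datasets import load_dataset
from transformers import pipeline

# GLUE SST-2 validation split: fields "sentence" and "label" (0 = negative, 1 = positive).
dataset = load_dataset("glue", "sst2", split="validation")

# Illustrative checkpoint; any SST-2 fine-tuned classifier could be substituted.
classifier = pipeline("sentiment-analysis",
                      model="distilbert-base-uncased-finetuned-sst-2-english")

correct = 0
for example in dataset:
    pred = classifier(example["sentence"])[0]["label"]  # "POSITIVE" or "NEGATIVE"
    pred_id = 1 if pred == "POSITIVE" else 0
    correct += int(pred_id == example["label"])

# Printed as a percentage, matching the leaderboard's Accuracy column.
print(f"Accuracy: {100 * correct / len(dataset):.2f}")
```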