Network Pruning on ImageNet
Metric: Accuracy

Results
Performance results of various models on this benchmark.
| Model Name | Accuracy (%) | Paper Title |
| --- | --- | --- |
| MobileNetV1-50% FLOPs | 70.7 | EagleEye: Fast Sub-net Evaluation for Efficient Neural Network Pruning |
| ResNet50-2.3 GFLOPs | 78.79 | Pruning Filters for Efficient ConvNets |
| ResNet50-1G FLOPs | 74.2 | EagleEye: Fast Sub-net Evaluation for Efficient Neural Network Pruning |
| ResNet50-1.5 GFLOPs | 78.07 | Pruning Filters for Efficient ConvNets |
| TAS-pruned ResNet-50 | 76.20 | Network Pruning via Transformable Architecture Search |
| ResNet50 | 73.14 | AC/DC: Alternating Compressed/DeCompressed Training of Deep Neural Networks |
| ResNet50-2G FLOPs | 76.4 | EagleEye: Fast Sub-net Evaluation for Efficient Neural Network Pruning |
| ResNet50-1G FLOPs | 76.376 | Pruning Filters for Efficient ConvNets |
| SqueezeNet (6-bit Deep Compression) | 57.5 | SqueezeNet: AlexNet-level accuracy with 50x fewer parameters and <0.5MB model size |
| RegX-1.6G | 77.97 | Group Fisher Pruning for Practical Network Compression |
| ResNet50-3G FLOPs | 77.1 | EagleEye: Fast Sub-net Evaluation for Efficient Neural Network Pruning |
| ResNet50 2.0 GFLOPS | 77.70 | Knapsack Pruning with Inner Distillation |
| ResNet50 2.5 GFLOPS | 78.0 | Knapsack Pruning with Inner Distillation |
| ResNet50 | 75.59 | Network Pruning That Matters: A Case Study on Retraining Variants |
| MobileNetV2 | 73.42 | Group Fisher Pruning for Practical Network Compression |
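Most entries above are structured (filter-level) pruning results, where whole convolutional filters are removed to hit a compute budget such as the 1G/2G/3G FLOPs targets in the model names. The sketch below shows generic L1-norm structured filter pruning with PyTorch's built-in torch.nn.utils.prune; it is a baseline illustration, not the procedure of any listed paper, and the pruning ratio of 0.3 is arbitrary.

```python
import torch.nn as nn
import torch.nn.utils.prune as prune
import torchvision

def prune_conv_filters(model: nn.Module, amount: float = 0.3) -> nn.Module:
    """Zero out the `amount` fraction of output filters with the smallest L1 norm
    in every Conv2d layer (generic baseline, not any listed paper's method)."""
    for module in model.modules():
        if isinstance(module, nn.Conv2d):
            prune.ln_structured(module, name="weight", amount=amount, n=1, dim=0)
            prune.remove(module, "weight")  # bake the pruning mask into the weights
    return model

model = torchvision.models.resnet50(weights=None)
model = prune_conv_filters(model, amount=0.3)  # 0.3 is an arbitrary illustration

# Sanity check: fraction of conv weights that survived pruning.
nonzero = sum((m.weight != 0).sum().item()
              for m in model.modules() if isinstance(m, nn.Conv2d))
total = sum(m.weight.numel()
            for m in model.modules() if isinstance(m, nn.Conv2d))
print(f"Remaining conv weights: {100.0 * nonzero / total:.1f}%")
```

Note that zeroing filters only masks them; the reduced FLOPs budgets reported in the table come from physically removing the pruned filters (and the corresponding input channels of the following layers) and then fine-tuning or retraining the compact network.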