GPU Leaderboard

Compare performance metrics for AI workloads across different NVIDIA GPUs
GPU Comparison
| Metric | NVIDIA Tesla V100 PCIe 16 GB | NVIDIA Tesla V100 SXM2 16 GB |
|---|---|---|
| Architecture | NVIDIA Volta | NVIDIA Volta |
| Release Date | 2017.06 | 2019.11 |
| Memory | 16 GB HBM2 | 16 GB HBM2 |
| FP8 Performance | Not supported | Not supported |
| FP16 Performance | 28.26 TFLOPS | 31.33 TFLOPS |
| FP32 Performance | 14.13 TFLOPS | 15.67 TFLOPS |
| FP64 Performance | 7.07 TFLOPS | 7.83 TFLOPS |
| CUDA Cores | 5120 | 5120 |
| Power Consumption | 250 W | 300 W |
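
These peak figures follow directly from the core count and clock speed: each CUDA core can retire one fused multiply-add (two FLOPs) per clock, so peak FP32 ≈ cores × 2 × boost clock, with FP16 at twice and FP64 at half that rate on Volta. A minimal sketch of the arithmetic, assuming NVIDIA's published boost clocks of 1380 MHz (PCIe) and 1530 MHz (SXM2):

```python
# Peak throughput on Volta: cores x 2 FLOPs/clock (FMA) x boost clock.
# Boost clocks are NVIDIA's published figures for each V100 variant.
CARDS = {
    "V100 PCIe 16 GB": {"cuda_cores": 5120, "boost_clock_ghz": 1.380},
    "V100 SXM2 16 GB": {"cuda_cores": 5120, "boost_clock_ghz": 1.530},
}

for name, c in CARDS.items():
    fp32 = c["cuda_cores"] * 2 * c["boost_clock_ghz"] / 1e3  # TFLOPS
    # On Volta, FP16 runs at 2x the FP32 rate and FP64 at 1/2 the FP32 rate.
    print(f"{name}: FP16 {2 * fp32:.2f} | FP32 {fp32:.2f} | FP64 {fp32 / 2:.2f} TFLOPS")
```

Both rows reproduce the table above to two decimal places.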

Performance Summary

FP8 Performance: not available for comparison (neither card supports FP8)
FP16 Performance: the PCIe card is 9.80% slower than the SXM2 card
FP32 Performance: the PCIe card is 9.83% slower than the SXM2 card
FP64 Performance: the PCIe card is 9.71% slower than the SXM2 card
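
The percentages above are relative to the faster (SXM2) card, i.e. (SXM2 − PCIe) / SXM2. A quick check using the TFLOPS figures from the comparison table:

```python
# Relative deficit of the PCIe card versus the SXM2 card, per precision.
pcie = {"FP16": 28.26, "FP32": 14.13, "FP64": 7.07}  # TFLOPS
sxm2 = {"FP16": 31.33, "FP32": 15.67, "FP64": 7.83}  # TFLOPS

for metric in pcie:
    deficit = (sxm2[metric] - pcie[metric]) / sxm2[metric] * 100
    print(f"{metric}: PCIe is {deficit:.2f}% slower than SXM2")
```
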
GPU Benchmark Comparison
| GPU Model | Release Date | Architecture | CUDA Cores | Memory | TGP | FP8 (TFLOPS) | FP16 (TFLOPS) | FP32 (TFLOPS) | FP64 (TFLOPS) |
|---|---|---|---|---|---|---|---|---|---|
| NVIDIA Tesla V100 PCIe 16 GB | 2017.06 | NVIDIA Volta | 5120 | 16 GB HBM2 | 250 W | Not supported | 28.26 | 14.13 | 7.07 |
| NVIDIA Tesla V100 SXM2 16 GB | 2019.11 | NVIDIA Volta | 5120 | 16 GB HBM2 | 300 W | Not supported | 31.33 | 15.67 | 7.83 |
| NVIDIA Tesla V100 PCIe 32 GB | 2018.03 | NVIDIA Volta | 5120 | 32 GB HBM2 | 250 W | Not supported | 28.26 | 14.13 | 7.07 |
| NVIDIA Tesla V100 SXM2 32 GB | 2018.03 | NVIDIA Volta | 5120 | 32 GB HBM2 | 300 W | Not supported | 31.33 | 15.67 | 7.83 |
| NVIDIA GeForce RTX 3090 | 2020.09 | NVIDIA Ampere | 10496 | 24 GB GDDR6X | 350 W | Not supported | 35.58 | 35.58 | 0.56 |
| NVIDIA GeForce RTX 4090 | 2022.10 | NVIDIA Ada Lovelace | 16384 | 24 GB GDDR6X | 450 W | Not supported | 82.58 | 82.58 | 1.29 |
| NVIDIA GeForce RTX 5090 | 2025.01 | NVIDIA Blackwell | 21760 | 32 GB GDDR7 | 575 W | Not supported | 104.8 | 104.8 | 1.64 |
| NVIDIA A40 PCIe | 2020.10 | NVIDIA Ampere | 10752 | 48 GB GDDR6 | 300 W | Not supported | 37.42 | 37.42 | 0.58 |
| NVIDIA RTX A6000 | 2021.03 | NVIDIA Ampere | 10752 | 48 GB GDDR6 | 300 W | Not supported | 38.71 | 38.71 | 0.60 |
| NVIDIA A100 PCIe 40 GB | 2020.06 | NVIDIA Ampere | 6912 | 40 GB HBM2e | 250 W | Not supported | 77.97 | 19.49 | 9.75 |
| NVIDIA A100 SXM4 40 GB | 2020.05 | NVIDIA Ampere | 6912 | 40 GB HBM2e | 400 W | Not supported | 77.97 | 19.49 | 9.75 |
| NVIDIA A100 PCIe 80 GB | 2021.06 | NVIDIA Ampere | 6912 | 80 GB HBM2e | 300 W | Not supported | 77.97 | 19.49 | 9.75 |
| NVIDIA A100 SXM4 80 GB | 2020.11 | NVIDIA Ampere | 6912 | 80 GB HBM2e | 400 W | Not supported | 77.97 | 19.49 | 9.75 |
| NVIDIA L40 | 2022.10 | NVIDIA Ada Lovelace | 14080 | 48 GB GDDR6 | 300 W | Not supported | 90.52 | 90.52 | 1.41 |
| NVIDIA L40s | 2023.08 | NVIDIA Ada Lovelace | 14080 | 48 GB GDDR6 | 350 W | Not supported | 91.61 | 91.61 | 1.43 |
| NVIDIA H100 PCIe 80 GB | 2023.03 | NVIDIA Hopper | 16896 | 80 GB HBM3 | 350 W | 3026 | 204.9 | 51.22 | 25.61 |
| NVIDIA H100 SXM5 80 GB | 2023.03 | NVIDIA Hopper | 16896 | 80 GB HBM3 | 700 W | 3958 | 267.6 | 66.91 | 33.45 |
| NVIDIA H100 PCIe 96 GB | 2023.03 | NVIDIA Hopper | 16896 | 96 GB HBM3 | 350 W | 3026 | 248.3 | 62.08 | 31.04 |
| NVIDIA H100 SXM5 96 GB | 2023.03 | NVIDIA Hopper | 16896 | 96 GB HBM3 | 700 W | 3026 | 248.3 | 66.91 | 31.04 |
| NVIDIA H200 NVL | 2024.11 | NVIDIA Hopper | 16896 | 141 GB HBM3 | 600 W | 3026 | 241.3 | 60.32 | 30.16 |
| NVIDIA H200 SXM 141 GB | 2024.11 | NVIDIA Hopper | 16896 | 141 GB HBM3 | 700 W | 3958 | 267.6 | 66.91 | 33.45 |
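
Raw TFLOPS alone ignores power draw, which often dominates hosting cost. One way to read the table is throughput per watt of TGP; the sketch below ranks a handful of rows transcribed from above by FP16 TFLOPS per watt (the selection is illustrative, not exhaustive):

```python
# FP16 TFLOPS per watt of TGP, for a few rows transcribed from the table above.
ROWS = [
    ("NVIDIA Tesla V100 SXM2 16 GB", 31.33, 300),
    ("NVIDIA GeForce RTX 4090",      82.58, 450),
    ("NVIDIA L40s",                  91.61, 350),
    ("NVIDIA A100 SXM4 80 GB",       77.97, 400),
    ("NVIDIA H100 SXM5 80 GB",       267.6, 700),
    ("NVIDIA H200 SXM 141 GB",       267.6, 700),
]

# Sort by efficiency, best first.
for name, fp16_tflops, tgp_w in sorted(ROWS, key=lambda r: r[1] / r[2], reverse=True):
    print(f"{name:32s} {fp16_tflops / tgp_w:.3f} FP16 TFLOPS/W")
```

On this subset the Hopper SXM parts lead at roughly 0.38 FP16 TFLOPS/W, more than triple the V100 SXM2's 0.10.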