Aesthetic Quality Assessment on Images
Metrics
Accuracy (%)
MAE (Mean Absolute Error)
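As a minimal sketch of how these two metrics are computed (the rating data below is hypothetical, assuming discrete 5-level aesthetic labels; it is not drawn from the benchmark):

```python
# Minimal sketch (assumed data): evaluating ordinal aesthetic predictions
# with the two benchmark metrics, classification accuracy and MAE.

def accuracy(preds, labels):
    """Fraction of exact label matches, reported as a percentage."""
    correct = sum(p == t for p, t in zip(preds, labels))
    return 100.0 * correct / len(labels)

def mean_absolute_error(preds, labels):
    """Average absolute distance between predicted and true ratings."""
    return sum(abs(p - t) for p, t in zip(preds, labels)) / len(labels)

# Hypothetical 5-level aesthetic ratings (1 = lowest, 5 = highest).
labels = [3, 4, 2, 5, 3, 1]
preds  = [3, 4, 3, 5, 2, 1]

print(f"Accuracy: {accuracy(preds, labels):.2f}%")       # 66.67%
print(f"MAE: {mean_absolute_error(preds, labels):.3f}")  # 0.333
```

Accuracy rewards exact rank matches only, while MAE also credits near-misses, which is why the two metrics can order models differently.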
Results
Performance results of various models on this benchmark.
Comparison Table
| Model Name | Accuracy (%) | MAE |
|---|---|---|
| ordinalclip-learning-rank-prompts-for | 73.05 | 0.280 |
| soft-labels-for-ordinal-regression | 72.03 | 0.290 |
| learning-probabilistic-ordinal-embeddings-for | 72.44 | 0.287 |
| a-constrained-deep-neural-network-for-ordinal | 70.05 | 0.316 |