Evaluating the Performance of TAAF for Image Classification Models

Bryn T Chatfield

Abstract

In this paper, we present the results of testing a custom activation function, the Analog Activation Function (TAAF), on the MNIST and CIFAR-10 datasets. TAAF is a novel activation function designed to improve neural-network performance through a unique mathematical formulation. We evaluate TAAF in a convolutional neural network (CNN) architecture, comparing it against standard activation functions on MNIST and against ELU on CIFAR-10. On MNIST, TAAF achieves a test accuracy of 99.39%, slightly surpassing standard activation functions. On CIFAR-10, TAAF reaches 79.37%, significantly exceeding ELU's 72.06% in the same architecture and suggesting improved generalization. These results establish a solid performance baseline for TAAF across different image classification tasks.
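The abstract does not give TAAF's mathematical formulation, so it cannot be reproduced here. For reference, the ELU baseline used in the CIFAR-10 comparison is well defined: it is the identity for positive inputs and a saturating exponential for negative ones. A minimal sketch of ELU (standard definition, with the usual default alpha = 1.0):

```python
import math

def elu(x: float, alpha: float = 1.0) -> float:
    """ELU activation: x for x > 0, alpha * (exp(x) - 1) otherwise.

    This is the baseline activation TAAF is compared against on CIFAR-10.
    """
    return x if x > 0 else alpha * (math.exp(x) - 1.0)

# Positive inputs pass through unchanged; negative inputs saturate toward -alpha.
print(elu(1.0))   # 1.0
print(elu(-1.0))  # exp(-1) - 1 ≈ -0.6321
```

Unlike ReLU, ELU produces negative outputs bounded below by -alpha, which pushes mean activations toward zero; any advantage TAAF shows over this baseline would stem from its own (unpublished here) formulation.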

