A Benchmark for Activation Function Exploration for Neural Architecture Search (NAS)
Updated Mar 11, 2025 - Python
This is a custom-built neural network that recognizes handwritten digits from image inputs. It uses ReLU activations in the hidden layers and a softmax activation in the output layer for classification. The model is trained with backpropagation to minimize a prediction-error loss, achieving over 99% accuracy when predicting digits.
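As a rough illustration of the architecture described above, here is a minimal from-scratch sketch: one ReLU hidden layer, a softmax output layer, and manual backpropagation with gradient descent. The layer sizes, learning rate, and choice of cross-entropy loss are illustrative assumptions, not the repository's actual code.

```python
import numpy as np

class DigitClassifier:
    """Sketch of a two-layer network: ReLU hidden layer, softmax output."""

    def __init__(self, n_in=784, n_hidden=128, n_out=10, seed=0):
        rng = np.random.default_rng(seed)
        # He initialization suits ReLU hidden units (assumed, not from the repo).
        self.W1 = rng.normal(0.0, np.sqrt(2.0 / n_in), (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, np.sqrt(2.0 / n_hidden), (n_hidden, n_out))
        self.b2 = np.zeros(n_out)

    def forward(self, X):
        """ReLU hidden layer, then a softmax over the 10 digit classes."""
        self.h = np.maximum(0.0, X @ self.W1 + self.b1)
        logits = self.h @ self.W2 + self.b2
        e = np.exp(logits - logits.max(axis=1, keepdims=True))  # stable softmax
        return e / e.sum(axis=1, keepdims=True)

    def train_step(self, X, y_onehot, lr=0.1):
        """One backpropagation step minimizing cross-entropy loss."""
        p = self.forward(X)
        n = X.shape[0]
        # For softmax + cross-entropy, d(loss)/d(logits) = p - y.
        d_logits = (p - y_onehot) / n
        dW2 = self.h.T @ d_logits
        db2 = d_logits.sum(axis=0)
        dh = d_logits @ self.W2.T
        dh[self.h <= 0.0] = 0.0  # ReLU passes gradient only where h > 0
        dW1 = X.T @ dh
        db1 = dh.sum(axis=0)
        # Plain gradient-descent update.
        self.W1 -= lr * dW1; self.b1 -= lr * db1
        self.W2 -= lr * dW2; self.b2 -= lr * db2
        labels = y_onehot.argmax(axis=1)
        return -np.mean(np.log(p[np.arange(n), labels] + 1e-12))

# Quick smoke test on random data shaped like flattened 28x28 images.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.random((32, 784))
    y = np.eye(10)[rng.integers(0, 10, 32)]
    net = DigitClassifier()
    for _ in range(5):
        loss = net.train_step(X, y)
    print(f"loss after 5 steps: {loss:.4f}")
```

The gradient shortcut `p - y` follows from combining the softmax with cross-entropy; the ReLU gradient is simply a mask on the hidden activations. Reaching the quoted 99%+ accuracy would additionally require real training data (e.g., MNIST) and tuned hyperparameters.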