# activation-function-exploration

Here are 2 public repositories matching this topic...

A custom-built neural network that recognizes handwritten digits from image inputs. It uses ReLU activations in the hidden layers and a softmax activation function in the output layer for classification. The model is trained with backpropagation on a loss function that minimizes prediction error, achieving over 99% accuracy when predicting digits. A minimal sketch of this kind of setup follows the listing below.

  • Updated Apr 11, 2025
  • Python
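
The description above maps onto a small feed-forward classifier. Below is a minimal sketch of that architecture in NumPy: one ReLU hidden layer, a softmax output layer, and plain backpropagation on a cross-entropy loss. The layer sizes, learning rate, random stand-in data, and all function names are illustrative assumptions, not code from the repository.

```python
# Illustrative sketch, not the repository's code: ReLU hidden layer,
# softmax output, backpropagation on cross-entropy loss.
import numpy as np

rng = np.random.default_rng(0)

def relu(z):
    return np.maximum(0.0, z)

def softmax(z):
    # Subtract the row-wise max for numerical stability.
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def forward(x, params):
    w1, b1, w2, b2 = params
    h = relu(x @ w1 + b1)      # hidden layer with ReLU activation
    p = softmax(h @ w2 + b2)   # output layer: class probabilities
    return h, p

def train_step(x, y_onehot, params, lr=0.1):
    w1, b1, w2, b2 = params
    h, p = forward(x, params)
    n = x.shape[0]
    # Gradient of cross-entropy loss w.r.t. the softmax pre-activation.
    dz2 = (p - y_onehot) / n
    dw2 = h.T @ dz2
    db2 = dz2.sum(axis=0)
    # Backpropagate through the ReLU hidden layer (h > 0 is its derivative mask).
    dz1 = (dz2 @ w2.T) * (h > 0)
    dw1 = x.T @ dz1
    db1 = dz1.sum(axis=0)
    return (w1 - lr * dw1, b1 - lr * db1, w2 - lr * dw2, b2 - lr * db2)

# Assumed toy setup: 28x28 grayscale images flattened to 784 features, 10 digit classes.
n_in, n_hidden, n_out = 784, 128, 10
params = (
    rng.normal(0, np.sqrt(2 / n_in), (n_in, n_hidden)),  # He init suits ReLU
    np.zeros(n_hidden),
    rng.normal(0, np.sqrt(2 / n_hidden), (n_hidden, n_out)),
    np.zeros(n_out),
)

x = rng.random((32, n_in))                     # stand-in batch of images
y = np.eye(n_out)[rng.integers(0, n_out, 32)]  # stand-in one-hot labels
for _ in range(100):
    params = train_step(x, y, params)
print(forward(x, params)[1].argmax(axis=1))    # predicted digit per image
```

On real digit data the same loop would run over many mini-batches for several epochs; the random arrays here only demonstrate that the forward pass and gradients are wired correctly.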

## Improve this page

Add a description, image, and links to the activation-function-exploration topic page so that developers can more easily learn about it.

## Add this topic to your repo

To associate your repository with the activation-function-exploration topic, visit your repo's landing page and select "manage topics."
