Activation Functions


Posted: 07 Jul 2025. Last modified: 07 Jul 2025.




Activation functions are mathematical operations that determine the output of a neural network node, introducing non-linearity and enabling neural networks to learn complex patterns.
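To illustrate why the non-linearity matters, here is a minimal NumPy sketch (my own, not from the article; the variable names and shapes are arbitrary). Without an activation, two stacked linear layers collapse into a single linear map; inserting something like ReLU breaks that collapse, which is what lets deeper networks model more complex patterns.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=3)          # example input vector
W1 = rng.normal(size=(4, 3))    # first layer weights
W2 = rng.normal(size=(2, 4))    # second layer weights

relu = lambda z: np.maximum(0.0, z)

linear_stack = W2 @ (W1 @ x)      # two linear layers, no activation...
collapsed    = (W2 @ W1) @ x      # ...equal one single linear layer
with_relu    = W2 @ relu(W1 @ x)  # a ReLU in between breaks the collapse

print(np.allclose(linear_stack, collapsed))  # True: still just one linear map
print(np.allclose(with_relu, collapsed))     # almost surely False
```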

Wikipedia has a list of common activation functions (https://en.wikipedia.org/wiki/Activation_function).

I’ve taken some notes on the following activation functions, along with their equations and key characteristics (a small NumPy sketch of a few of them appears after the list):

Sigmoid (Logistic)

Hyperbolic Tangent (tanh)

Rectified Linear Unit (ReLU)

Leaky ReLU

Parametric ReLU (PReLU)

Exponential Linear Unit (ELU)

Softmax

Swish

Gaussian Error Linear Unit (GELU)

Linear (Identity)

Softplus

Scaled Exponential Linear Unit (SELU)
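
To make a few of these concrete, here is a minimal NumPy sketch using the standard textbook definitions. The function names, default parameters (e.g. alpha = 0.01 for Leaky ReLU), and example values are my own choices for illustration, not taken from the article; the equations are given in the comments.

```python
import numpy as np

def sigmoid(x):                     # sigma(x) = 1 / (1 + e^(-x))
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):                        # f(x) = max(0, x)
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):      # f(x) = x if x > 0 else alpha * x
    return np.where(x > 0, x, alpha * x)

def elu(x, alpha=1.0):              # f(x) = x if x > 0 else alpha * (e^x - 1)
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def softplus(x):                    # f(x) = ln(1 + e^x), a smooth ReLU
    return np.log1p(np.exp(x))

def swish(x, beta=1.0):             # f(x) = x * sigma(beta * x)
    return x * sigmoid(beta * x)

def softmax(x):                     # softmax(x)_i = e^(x_i) / sum_j e^(x_j)
    z = np.exp(x - np.max(x))       # shift by the max for numerical stability
    return z / np.sum(z)

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(relu(x))           # zeros for negative inputs, identity for positives
print(softmax(x).sum())  # ~1.0: softmax outputs form a probability distribution
```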


Usage Notes: