Activation Functions

Non‑linear functions (e.g., ReLU, Sigmoid, Tanh, GELU) applied to each neuron's weighted sum so neural networks can model complex patterns; without them, a stack of linear layers would collapse into a single linear map.
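
As a minimal sketch of the four functions named above (assuming NumPy is available; the GELU shown uses the common tanh approximation rather than the exact normal-CDF form):

```python
import numpy as np

def relu(x):
    # Zeroes out negatives; identity for positives.
    return np.maximum(0.0, x)

def sigmoid(x):
    # Squashes inputs into (0, 1).
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Squashes inputs into (-1, 1); zero-centered.
    return np.tanh(x)

def gelu(x):
    # Tanh approximation of GELU: x * Phi(x), where Phi is the standard normal CDF.
    return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * x**3)))

# Example pre-activations (hypothetical values, for illustration only).
z = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
for f in (relu, sigmoid, tanh, gelu):
    print(f.__name__, f(z))
```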