Components

Activation Function

Non-linear functions that determine neuron output

What is an Activation Function?

Activation functions are mathematical functions that determine the output of a neural network node. They introduce non-linearity into the network; without them, a stack of layers would collapse into a single linear transformation and could not learn complex patterns. Common activation functions include ReLU, sigmoid, and tanh.
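The three functions named above can be sketched in a few lines of NumPy. This is a minimal illustration, not a framework implementation; the function names are ours:

```python
import numpy as np

def relu(x):
    # ReLU: passes positive values through, zeros out negatives
    return np.maximum(0.0, x)

def sigmoid(x):
    # Sigmoid: squashes any input into the range (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Tanh: squashes any input into (-1, 1), zero-centered
    return np.tanh(x)

x = np.array([-2.0, 0.0, 2.0])
print(relu(x))     # negative inputs become 0, positives pass through
print(sigmoid(x))  # all outputs strictly between 0 and 1
print(tanh(x))     # all outputs strictly between -1 and 1
```

Applying one of these element-wise after each layer's weighted sum is what makes the layer's output non-linear in its inputs.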

Key Points

1. Introduces non-linearity
2. Determines neuron firing
3. Different types for different purposes
4. Critical for deep learning

Practical Examples

ReLU (Rectified Linear Unit): max(0, x), the common default for hidden layers
Sigmoid: maps inputs to (0, 1), often used for binary outputs
Tanh: maps inputs to (-1, 1), zero-centered
Softmax: converts a vector of scores into a probability distribution, used for multi-class classification
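Softmax differs from the others in that it acts on a whole vector of scores at once. A minimal sketch (the stability trick of subtracting the max is standard practice; the variable names are ours):

```python
import numpy as np

def softmax(logits):
    # Subtract the max score for numerical stability (avoids overflow in exp),
    # then exponentiate and normalize so the outputs sum to 1
    shifted = logits - np.max(logits)
    exps = np.exp(shifted)
    return exps / exps.sum()

scores = np.array([2.0, 1.0, 0.1])
probs = softmax(scores)
print(probs)        # a probability distribution over the three classes
print(probs.sum())  # sums to 1.0
```

The largest score always receives the largest probability, which is why softmax is the standard final layer for multi-class classification.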