ReLU
Rectified linear unit: max(0, x). The default non-linearity in modern neural networks: it is cheap to compute and its gradient is exactly 1 for positive inputs, which mitigates (though does not fully eliminate) vanishing gradients.
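A minimal NumPy sketch of the definition above (function names are illustrative, not from any particular framework):

```python
import numpy as np

def relu(x):
    # Element-wise max(0, x): passes positive values through, zeroes out negatives.
    return np.maximum(0, x)

def relu_grad(x):
    # Derivative: 1 where x > 0, else 0 (undefined at 0; conventionally taken as 0).
    return (x > 0).astype(x.dtype)

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))       # [0.  0.  0.  1.5 3. ]
print(relu_grad(x))  # [0. 0. 0. 1. 1.]
```

The constant gradient of 1 on the positive side is what keeps gradients from shrinking as they propagate through many layers, unlike sigmoid or tanh, whose derivatives are always less than 1.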