Posted on Mar 12, 2022

## Why ReLU is not differentiable at x=0 (zero)?

Posted in Machine Learning

ReLU is one of the most widely used activation functions. For any $$x > 0$$, the output of ReLU is $$x$$, and $$0$$ otherwise. We can also write it as $$ReLU(x) = \max(0, x)$$. For the rest of the post, let's say $$f(x) = ReLU(x)$$....
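To see why $$f(x)$$ is not differentiable at $$x = 0$$, we can compare the one-sided derivatives there: approaching from the right the slope is $$1$$, but from the left it is $$0$$. Here is a minimal sketch in plain Python (the function names `relu` and `one_sided_derivative` are just illustrative, not from any library) that estimates both limits with a forward difference:

```python
def relu(x):
    # ReLU(x) = max(0, x)
    return max(0.0, x)

def one_sided_derivative(f, x, h):
    # Forward difference with a signed step h: (f(x + h) - f(x)) / h.
    # A positive h approximates the right-hand limit, a negative h the left-hand limit.
    return (f(x + h) - f(x)) / h

h = 1e-6
right = one_sided_derivative(relu, 0.0, h)   # slope approaching 0 from the right
left = one_sided_derivative(relu, 0.0, -h)   # slope approaching 0 from the left

print(right)  # 1.0
print(left)   # 0.0
```

Since the two one-sided limits disagree ($$1 \neq 0$$), the derivative $$f'(0)$$ does not exist.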