ReLU is one of the most widely used activation functions. For any \(x > 0\), the output of ReLU is \(x\); otherwise it is \(0\). We can also write this as \(\mathrm{ReLU}(x) = \max(0, x)\). For the rest of the post, let's say \(f(x) = \mathrm{ReLU}(x)\)....
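As a quick sketch, the \(\max(0, x)\) definition above translates directly into code (a minimal, dependency-free example; the function name `relu` is just an illustrative choice):

```python
def relu(x: float) -> float:
    # ReLU(x) = max(0, x): pass positive inputs through, clip negatives to 0
    return max(0.0, x)

print(relu(3.5))   # positive input passes through unchanged -> 3.5
print(relu(-2.0))  # negative input is clipped -> 0.0
```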

Mar 12, 2022


# Month: March 2022

