ReLU

class ReLU(name=None)

Applies the rectified linear unit function element-wise:

\[\text{ReLU}(x) = (x)^+ = \max(x, 0)\]
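Equivalently, every negative element is replaced by zero while non-negative elements pass through unchanged. As a minimal NumPy sketch of the same element-wise computation (an illustration of the formula, not MegEngine's implementation):

>>> import numpy as np
>>> x = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
>>> np.maximum(x, 0)  # element-wise max(x, 0)
array([0., 0., 0., 1., 2.])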

Examples

>>> import numpy as np
>>> import megengine as mge
>>> import megengine.module as M
>>> data = mge.tensor(np.array([-2, -1, 0, 1, 2]).astype(np.float32))
>>> relu = M.ReLU()
>>> output = relu(data)
>>> with np.printoptions(precision=6):
...     print(output.numpy())
[0. 0. 0. 1. 2.]
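The same operation is also available in functional form; a minimal sketch assuming the functional counterpart megengine.functional.relu, continuing from the example above:

>>> import megengine.functional as F
>>> F.relu(data).numpy()  # same result without constructing a module
array([0., 0., 0., 1., 2.], dtype=float32)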