ReLU

class ReLU(name=None)[source]

Applies the function element-wise:

\[\text{ReLU}(x) = \max(x, 0)\]

Examples

import numpy as np
import megengine as mge
import megengine.module as M
data = mge.tensor(np.array([-2, -1, 0, 1, 2]).astype(np.float32))
relu = M.ReLU()
output = relu(data)
with np.printoptions(precision=6):
    print(output.numpy())

Output:

[0. 0. 0. 1. 2.]
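The formula above is simply an element-wise maximum with zero. As a minimal sketch independent of MegEngine, the same computation can be reproduced with plain NumPy (the `relu` helper below is for illustration only and is not part of the MegEngine API):

```python
import numpy as np

def relu(x):
    # Element-wise max(x, 0), matching the formula above
    return np.maximum(x, 0)

data = np.array([-2, -1, 0, 1, 2], dtype=np.float32)
print(relu(data))  # [0. 0. 0. 1. 2.]
```

This matches the output of `M.ReLU()` on the same input.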