PReLU

class PReLU(num_parameters=1, init=0.25, **kwargs)[source]

Applies the element-wise function:

\[\text{PReLU}(x) = \max(0,x) + a * \min(0,x)\]

or

\[\begin{split}\text{PReLU}(x) = \begin{cases} x, & \text{ if } x \geq 0 \\ ax, & \text{ otherwise } \end{cases}\end{split}\]

Here \(a\) is a learnable parameter. When called without arguments, PReLU() uses a single parameter \(a\) across all input channels. If called as PReLU(num_of_channels), each input channel gets its own \(a\).
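The two formulas above are equivalent. A minimal NumPy sketch of the element-wise rule (an illustration of the math only, not MegEngine's implementation):

```python
import numpy as np

def prelu(x: np.ndarray, a: float = 0.25) -> np.ndarray:
    """Element-wise PReLU: max(0, x) + a * min(0, x)."""
    return np.maximum(0.0, x) + a * np.minimum(0.0, x)

x = np.array([-1.2, -3.7, 2.7], dtype=np.float32)
print(prelu(x))  # negative inputs are scaled by a, positive inputs pass through
```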

Parameters
  • num_parameters (int) – number of \(a\) to learn. Only two values are legitimate: 1, or the number of channels of the input. Default: 1

  • init (float) – the initial value of \(a\). Default: 0.25

Examples

>>> import numpy as np
>>> import megengine as mge
>>> import megengine.module as M
>>> data = mge.tensor(np.array([-1.2, -3.7, 2.7]).astype(np.float32))
>>> prelu = M.PReLU()
>>> output = prelu(data)
>>> output.numpy()
array([-0.3  , -0.925,  2.7  ], dtype=float32)
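With num_parameters set to the number of input channels, each channel is scaled by its own slope. A hedged NumPy sketch of the per-channel broadcasting (the (N, C, H, W) layout and the slope values here are illustrative assumptions, not taken from the docs):

```python
import numpy as np

# One learnable slope per channel, broadcast over the spatial dimensions.
# Assumed layout for illustration: (N, C, H, W) with C = 2.
a = np.array([0.25, 0.5], dtype=np.float32).reshape(1, 2, 1, 1)
x = np.full((1, 2, 1, 1), -1.0, dtype=np.float32)  # same negative input in both channels
out = np.maximum(0.0, x) + a * np.minimum(0.0, x)
print(out.reshape(-1))  # channel 0 scaled by 0.25, channel 1 by 0.5
```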