RNNCell

class RNNCell(input_size, hidden_size, bias=True, nonlinearity='tanh')

An Elman RNN cell with tanh or ReLU non-linearity.

\[h' = \tanh(W_{ih} x + b_{ih} + W_{hh} h + b_{hh})\]

If nonlinearity is 'relu', then ReLU is used in place of tanh.
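The update rule above can be sketched directly in NumPy. This is a minimal reference sketch of the cell equation, not the MegEngine implementation; the function name and weight layout (one weight matrix of shape `(hidden_size, input_size)` for the input and one of shape `(hidden_size, hidden_size)` for the hidden state) are assumptions for illustration:

```python
import numpy as np

def rnn_cell_step(x, h, W_ih, b_ih, W_hh, b_hh, nonlinearity="tanh"):
    """One Elman cell update: h' = act(W_ih x + b_ih + W_hh h + b_hh)."""
    pre = x @ W_ih.T + b_ih + h @ W_hh.T + b_hh
    if nonlinearity == "tanh":
        return np.tanh(pre)
    return np.maximum(pre, 0.0)  # 'relu'

rng = np.random.default_rng(0)
input_size, hidden_size, batch = 10, 20, 3
W_ih = rng.standard_normal((hidden_size, input_size))
b_ih = rng.standard_normal(hidden_size)
W_hh = rng.standard_normal((hidden_size, hidden_size))
b_hh = rng.standard_normal(hidden_size)

x = rng.standard_normal((batch, input_size))   # (batch, input_size)
h = np.zeros((batch, hidden_size))             # default zero initial hidden state
h_next = rnn_cell_step(x, h, W_ih, b_ih, W_hh, b_hh)
print(h_next.shape)  # (3, 20)
```

Note that with tanh the next hidden state is always bounded in (-1, 1), while the 'relu' variant is unbounded above.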

Parameters
  • input_size (int) – The number of expected features in the input x.

  • hidden_size (int) – The number of features in the hidden state h.

  • bias (bool) – If False, then the layer does not use bias weights b_ih and b_hh. Default: True.

  • nonlinearity (str) – The non-linearity to use. Can be either 'tanh' or 'relu'. Default: 'tanh'.

Shape:
  • Inputs: input, hidden

    input: (batch, input_size). Tensor containing input features.

    hidden: (batch, hidden_size). Tensor containing the initial hidden state for each element in the batch. Defaults to zero if not provided.

  • Outputs: h'

    h': (batch, hidden_size). Tensor containing the next hidden state for each element in the batch.

Examples

import numpy as np
import megengine as mge
import megengine.module as M

m = M.RNNCell(10, 20)  # input_size=10, hidden_size=20
inp = mge.tensor(np.random.randn(3, 10), dtype=np.float32)  # (batch, input_size)
hx = mge.tensor(np.random.randn(3, 20), dtype=np.float32)   # (batch, hidden_size)
out = m(inp, hx)  # next hidden state h'
print(out.numpy().shape)

Outputs:

(3, 20)
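A cell processes one time step at a time; to run it over a sequence, the update is applied in a loop, feeding each step's hidden state into the next. The sketch below unrolls the same recurrence in NumPy (hypothetical helper and variable names, not part of the MegEngine API):

```python
import numpy as np

def step(x, h, W_ih, b_ih, W_hh, b_hh):
    # One cell update: h' = tanh(W_ih x + b_ih + W_hh h + b_hh)
    return np.tanh(x @ W_ih.T + b_ih + h @ W_hh.T + b_hh)

rng = np.random.default_rng(42)
seq_len, batch, input_size, hidden_size = 6, 3, 10, 20
W_ih = rng.standard_normal((hidden_size, input_size))
b_ih = np.zeros(hidden_size)
W_hh = rng.standard_normal((hidden_size, hidden_size))
b_hh = np.zeros(hidden_size)

xs = rng.standard_normal((seq_len, batch, input_size))  # whole sequence
h = np.zeros((batch, hidden_size))  # zero initial state, as described above
outputs = []
for t in range(seq_len):
    h = step(xs[t], h, W_ih, b_ih, W_hh, b_hh)  # carry state forward
    outputs.append(h)
outputs = np.stack(outputs)  # (seq_len, batch, hidden_size)
print(outputs.shape)  # (6, 3, 20)
```

The same loop structure applies when driving an `M.RNNCell` instance over a sequence: call the cell once per time step and pass the returned hidden state back in on the next call.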