megengine.module

>>> import megengine.module as M

Float Module

Containers

Module

Base Module class.

Sequential

A sequential container.
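
A minimal sketch of both containers, assuming the usual Module subclassing pattern; the layer choices and shapes are illustrative only:

>>> import numpy as np
>>> import megengine as mge
>>> import megengine.module as M
>>> class ConvBlock(M.Module):           # subclass Module for custom logic
...     def __init__(self):
...         super().__init__()
...         self.net = M.Sequential(     # or chain layers with Sequential
...             M.Conv2d(3, 16, kernel_size=3, padding=1),
...             M.ReLU(),
...         )
...     def forward(self, x):
...         return self.net(x)
>>> block = ConvBlock()
>>> x = mge.Tensor(np.random.randn(1, 3, 8, 8).astype("float32"))
>>> block(x).shape  # (1, 16, 8, 8)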

General operations

Elemwise

A Module that applies an element-wise operator.

Concat

A Module that performs a functional concat.
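
A short sketch of both operations; it assumes Elemwise takes a lowercase method-name string and that Concat is called with a list of tensors plus an axis:

>>> import numpy as np
>>> import megengine as mge
>>> import megengine.module as M
>>> add = M.Elemwise("add")       # the method string selects the operator
>>> concat = M.Concat()
>>> a = mge.Tensor(np.ones((2, 3), dtype="float32"))
>>> b = mge.Tensor(np.ones((2, 3), dtype="float32"))
>>> y = add(a, b)                 # element-wise a + b
>>> concat([a, b], axis=0).shape  # (4, 3)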

Convolution Layers

Conv1d

Applies a 1D convolution over an input tensor.

Conv2d

Applies a 2D convolution over an input tensor.

Conv3d

Applies a 3D convolution over an input tensor.

ConvTranspose2d

Applies a 2D transposed convolution over an input tensor.

ConvTranspose3d

Applies a 3D transposed convolution over an input tensor.

LocalConv2d

Applies a spatial convolution with untied kernels over a grouped, channeled 4D input tensor.

DeformableConv2d

Deformable Convolution.

SlidingWindow

Applies a sliding window to the input tensor and copies the content of each window to the corresponding output location.

SlidingWindowTranspose

The opposite operation of SlidingWindow: sums over the sliding windows at the corresponding input locations.
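
A minimal convolution sketch; the channel counts and spatial sizes are illustrative, and ConvTranspose2d is assumed to take (in_channels, out_channels, kernel_size, stride) as below:

>>> import numpy as np
>>> import megengine as mge
>>> import megengine.module as M
>>> conv = M.Conv2d(in_channels=3, out_channels=8, kernel_size=3, padding=1)
>>> x = mge.Tensor(np.random.randn(2, 3, 32, 32).astype("float32"))
>>> y = conv(x)      # shape (2, 8, 32, 32)
>>> deconv = M.ConvTranspose2d(8, 3, kernel_size=2, stride=2)
>>> deconv(y).shape  # (2, 3, 64, 64)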

Pooling layers

AvgPool2d

Applies a 2D average pooling over an input.

MaxPool2d

Applies a 2D max pooling over an input.

AdaptiveAvgPool2d

Applies a 2D adaptive average pooling over an input.

AdaptiveMaxPool2d

Applies a 2D adaptive max pooling over an input.

DeformablePSROIPooling

Deformable position-sensitive RoI pooling.
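
A pooling sketch; the adaptive variants are assumed to accept an (H, W) output-size tuple rather than a kernel size:

>>> import numpy as np
>>> import megengine as mge
>>> import megengine.module as M
>>> x = mge.Tensor(np.random.randn(1, 8, 32, 32).astype("float32"))
>>> M.MaxPool2d(kernel_size=2, stride=2)(x).shape  # (1, 8, 16, 16)
>>> M.AdaptiveAvgPool2d((7, 7))(x).shape           # (1, 8, 7, 7)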

Padding layers

Pad

Pads the input tensor.

Non-linear Activations

Sigmoid

Applies the element-wise function \(\text{sigmoid}(x) = \frac{1}{1 + e^{-x}}\).

Softmax

Applies a softmax function.

ReLU

Applies the rectified linear unit function element-wise: \(\text{ReLU}(x) = \max(0, x)\).

LeakyReLU

Applies the element-wise function \(\text{LeakyReLU}(x) = \max(0, x) + \text{negative\_slope} \times \min(0, x)\).

PReLU

Applies the element-wise function \(\text{PReLU}(x) = \max(0, x) + a \times \min(0, x)\), where \(a\) is a learnable parameter.

SiLU

Applies the element-wise function \(\text{SiLU}(x) = x \times \text{sigmoid}(x)\).

GELU

Applies the element-wise function \(\text{GELU}(x) = x \, \Phi(x)\), where \(\Phi(x)\) is the cumulative distribution function of the standard normal distribution.
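
A small sketch of the activation modules; the values in the comments are approximate:

>>> import numpy as np
>>> import megengine as mge
>>> import megengine.module as M
>>> x = mge.Tensor(np.array([-1.0, 0.0, 2.0], dtype="float32"))
>>> M.ReLU()(x)          # [0.0, 0.0, 2.0]
>>> M.Sigmoid()(x)       # about [0.269, 0.5, 0.881]
>>> M.LeakyReLU(0.1)(x)  # [-0.1, 0.0, 2.0]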

Normalization Layers

BatchNorm1d

Applies Batch Normalization over a 2D or 3D input.

BatchNorm2d

Applies Batch Normalization over a 4D tensor.

SyncBatchNorm

Applies Synchronized Batch Normalization for distributed training.

GroupNorm

Applies Group Normalization over a mini-batch of inputs. Refer to Group Normalization.

InstanceNorm

Applies Instance Normalization over a mini-batch of inputs. Refer to Instance Normalization.

LayerNorm

Applies Layer Normalization over a mini-batch of inputs. Refer to Layer Normalization.

LocalResponseNorm

Applies local response normalization to the input tensor.
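
A minimal BatchNorm2d sketch showing the train/eval distinction; the channel count is illustrative:

>>> import numpy as np
>>> import megengine as mge
>>> import megengine.module as M
>>> bn = M.BatchNorm2d(16)
>>> x = mge.Tensor(np.random.randn(4, 16, 8, 8).astype("float32"))
>>> y = bn(x)  # training mode: normalizes with batch statistics
>>> bn.eval()
>>> y = bn(x)  # inference mode: uses the tracked running statistics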

Recurrent Layers

RNN

Applies a multi-layer Elman RNN with \(\tanh\) or \(\text{ReLU}\) non-linearity to an input sequence.

RNNCell

An Elman RNN cell with tanh or ReLU non-linearity.

LSTM

Applies a multi-layer long short-term memory (LSTM) RNN to an input sequence.

LSTMCell

A long short-term memory (LSTM) cell.
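
A sketch of LSTM usage, assuming the signature follows the common (input_size, hidden_size, batch_first) convention and that the initial hidden states default to zeros when omitted:

>>> import numpy as np
>>> import megengine as mge
>>> import megengine.module as M
>>> rnn = M.LSTM(input_size=32, hidden_size=64, batch_first=True)
>>> x = mge.Tensor(np.random.randn(8, 10, 32).astype("float32"))  # (batch, seq, feature)
>>> output, (h_n, c_n) = rnn(x)
>>> output.shape  # (8, 10, 64)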

Linear Layers

Identity

A placeholder identity operator that will ignore any argument.

Linear

Applies a linear transformation to the input.
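
A one-line Linear sketch; the feature sizes are illustrative:

>>> import numpy as np
>>> import megengine as mge
>>> import megengine.module as M
>>> linear = M.Linear(in_features=64, out_features=10)
>>> x = mge.Tensor(np.random.randn(4, 64).astype("float32"))
>>> linear(x).shape  # (4, 10)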

Dropout Layers

Dropout

Randomly sets elements of the input to zero with probability \(drop\_prob\) during training.
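
A Dropout sketch using the drop_prob keyword named above; dropout is active only in training mode:

>>> import numpy as np
>>> import megengine as mge
>>> import megengine.module as M
>>> dropout = M.Dropout(drop_prob=0.2)
>>> x = mge.Tensor(np.ones((4, 4), dtype="float32"))
>>> y = dropout(x)  # training mode: randomly zeroes elements
>>> dropout.eval()
>>> y = dropout(x)  # inference mode: identity mapping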

Sparse Layers

Embedding

A simple lookup table that stores embeddings of a fixed dictionary and size.
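
An Embedding sketch; the indices are assumed to be an int32 tensor, and the table size is arbitrary:

>>> import numpy as np
>>> import megengine as mge
>>> import megengine.module as M
>>> emb = M.Embedding(num_embeddings=1000, embedding_dim=16)
>>> idx = mge.Tensor(np.array([[1, 2, 3], [4, 5, 6]], dtype=np.int32))
>>> emb(idx).shape  # (2, 3, 16)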

Vision Layers

PixelShuffle

Rearranges elements in a tensor of shape \((*, C \times r^2, H, W)\) to a tensor of shape \((*, C, H \times r, W \times r)\), where \(r\) is an upscale factor and \(*\) is zero or more batch dimensions.
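
A sketch with upscale factor r = 2, so 12 input channels become 3 output channels at twice the resolution:

>>> import numpy as np
>>> import megengine as mge
>>> import megengine.module as M
>>> ps = M.PixelShuffle(2)
>>> x = mge.Tensor(np.random.randn(1, 12, 4, 4).astype("float32"))  # C x r^2 = 3 x 4
>>> ps(x).shape  # (1, 3, 8, 8)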

Fused operations

ConvRelu2d

A fused Module including Conv2d and relu.

ConvBn2d

A fused Module including Conv2d and BatchNorm2d.

ConvBnRelu2d

A fused Module including Conv2d, BatchNorm2d and relu.

BatchMatMulActivation

Batched matmul with activation (only relu supported); no transpose is applied to either operand.
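
A sketch of a fused block, assumed to behave like Conv2d, BatchNorm2d, and relu applied in sequence:

>>> import numpy as np
>>> import megengine as mge
>>> import megengine.module as M
>>> block = M.ConvBnRelu2d(3, 16, kernel_size=3, padding=1)
>>> x = mge.Tensor(np.random.randn(1, 3, 32, 32).astype("float32"))
>>> block(x).shape  # (1, 16, 32, 32)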

Quantization

QuantStub

A helper Module that simply returns its input.

DequantStub

A helper Module that simply returns its input.
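
In a float model both stubs are identity operations; they only mark where quantization should begin and end once the model is converted later. A minimal placement sketch (QuantReadyNet is a hypothetical name):

>>> import megengine.module as M
>>> class QuantReadyNet(M.Module):
...     def __init__(self):
...         super().__init__()
...         self.quant = M.QuantStub()      # quantization starts here after conversion
...         self.body = M.ConvBnRelu2d(3, 16, kernel_size=3, padding=1)
...         self.dequant = M.DequantStub()  # back to float here after conversion
...     def forward(self, x):
...         return self.dequant(self.body(self.quant(x)))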

QAT Module

Containers

QATModule

Base class of Modules that bridge float and quantized operation, used mainly for QAT and calibration.

Operations

Linear

A QATModule version of Linear.

Elemwise

A QATModule that applies an element-wise operator, with QAT support.

Concat

A QATModule that performs a functional concat, with QAT support.

Conv2d

A QATModule version of Conv2d with QAT support.

ConvRelu2d

A fused QATModule including Conv2d and relu with QAT support.

ConvBn2d

A fused QATModule including Conv2d and BatchNorm2d with QAT support.

ConvBnRelu2d

A fused QATModule including Conv2d, BatchNorm2d and relu with QAT support.

ConvTranspose2d

A QATModule version of ConvTranspose2d with QAT support.

BatchMatMulActivation

A QATModule version of BatchMatMulActivation with QAT support.

QuantStub

A helper QATModule that simply returns its input, but will quantize the input after being converted to a QuantizedModule.

DequantStub

A helper QATModule that simply returns its input, but will de-quantize the input after being converted to a QuantizedModule.
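
QAT modules are normally produced by converting a float model rather than instantiated directly. A sketch, assuming the megengine.quantization.quantize_qat helper and its ema_fakequant_qconfig preset:

>>> from megengine.quantization import quantize_qat, ema_fakequant_qconfig
>>> net = QuantReadyNet()  # the hypothetical float model sketched earlier
>>> qat_net = quantize_qat(net, qconfig=ema_fakequant_qconfig)
>>> # fine-tune qat_net as usual; fake quantization is applied in forward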

Quantized Module

QuantizedModule

Base class of quantized Modules, which should be converted from QATModule and do not support training.

Operations

Linear

Quantized version of Linear.

Elemwise

Quantized version of Elemwise.

Concat

A QuantizedModule that performs quantized concat, used for inference only.

Conv2d

Quantized version of Conv2d.

ConvRelu2d

Quantized version of ConvRelu2d.

ConvBn2d

Quantized version of ConvBn2d.

ConvBnRelu2d

Quantized version of ConvBnRelu2d.

ConvTranspose2d

Quantized version of ConvTranspose2d.

BatchMatMulActivation

Quantized version of BatchMatMulActivation.

QuantStub

Quantized version of QuantStub; converts the input to a quantized dtype.

DequantStub

Quantized version of DequantStub; restores the quantized input to float32 dtype.
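
Continuing the QAT sketch above, a trained QAT model is turned into a true quantized model with a final conversion step, assuming the megengine.quantization.quantize helper:

>>> from megengine.quantization import quantize
>>> quantized_net = quantize(qat_net)  # inference only; runs quantized kernels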

External Layers

ExternOprSubgraph

Load a serialized ExternOpr subgraph.

TensorrtRuntimeSubgraph

Load a serialized TensorrtRuntime subgraph.

CambriconRuntimeSubgraph

Load a serialized CambriconRuntime subgraph.

AtlasRuntimeSubgraph

Load a serialized AtlasRuntime subgraph.

MagicMindRuntimeSubgraph

Load a serialized MagicMindRuntime subgraph.