Get a model for MegEngine Lite inference

MegEngine trains models with dynamic graphs. After training is completed, the dynamic graph must be converted into a static graph before inference can be performed in MegEngine Lite. There are currently two ways to turn a trained model into an inference model:

  • trace_module mode: convert the dynamic graph into the traced_module IR with MegEngine Traced Module. On top of this IR you can perform graph surgery and other transformations (refer to TracedModule Common Graph Surgeries), and finally convert it into a static graph model that runs on MegEngine Lite.

  • Direct dump mode: convert the dynamic graph into a static graph with MegEngine's trace and dump functions.

As shown below:

graph LR
    training_code[training code] ==> |tm.trace_module| tm_file[.tm file]
    training_code .-> |dump| mge_file
    tm_file ==> |dump| mge_file[.mge file]
    mge_file ==> |load| litepy[Lite Python runtime]
    mge_file ==> |load| lite[Lite C++ runtime]

Use the trace_module method

Referring to Getting Started with TracedModule, the following example converts the already-trained resnet18 from the Model Center into a trace_module, and then dumps it into a static model that MegEngine Lite can load.

import numpy as np
import megengine.functional as F
import megengine.module as M
import megengine as mge
import megengine.traced_module as tm
from megengine import jit, tensor

# Replace this model with your own trained model
resnet = mge.hub.load("megengine/models", "resnet18", pretrained=True)

data = mge.Tensor(np.random.random([1, 3, 224, 224]).astype(np.float32))

traced_resnet = tm.trace_module(resnet, data)
# Graph surgery based on trace_module, as well as model conversion, can be done here
traced_resnet.eval()

@jit.trace(symbolic=True, capture_as_const=True)
def fun(data, *, net):
    pred = net(data)
    pred_normalized = F.softmax(pred)
    return pred_normalized

fun(data, net=traced_resnet)
fun.dump("resnet18.mge", arg_names=["data"])

The above code completes the following steps:

  • First, the pre-trained resnet18 model is downloaded from MegEngine's Model Center (https://megengine.org.cn/model-hub); users can replace it with their own pre-trained model.

  • Then, resnet18 is converted into the TracedModule model traced_resnet. At this stage users can perform graph surgery (refer to graphsurgeon-example) and model conversion (refer to mgeconvert: https://github.com/megengine/mgeconvert). The example above performs neither.

  • Finally, the traced_resnet model is serialized into the file resnet18.mge with trace and dump.

Note

If you need to dump your own model instead of a Model Center model, you can load and serialize the trained model through Save and Load Models (S&L) in MegEngine, and then replace the resnet above with it.

Direct dump method

Compared with the trace_module method above, the direct dump method simply skips the conversion to trace_module. Omitting this step sacrifices the ability to perform graph surgery and model conversion on the model. Refer to the following example.

import numpy as np
import megengine.functional as F
import megengine.hub
from megengine import jit, tensor

if __name__ == "__main__":

    # Replace this with your own trained model, or a model produced by trace_module.
    net = megengine.hub.load("megengine/models", "shufflenet_v2_x1_0", pretrained=True)
    net.eval()

    @jit.trace(symbolic=True, capture_as_const=True)
    def fun(data, *, net):
        pred = net(data)
        pred_normalized = F.softmax(pred)
        return pred_normalized

    data = tensor(np.random.random([1, 3, 224, 224]).astype(np.float32))

    fun(data, net=net)
    fun.dump("shufflenet_v2.mge", arg_names=["data"])

The above code downloads the shufflenet_v2_x1_0 model from the Model Center, then traces and dumps it, completing the conversion from a dynamic graph model to a static graph model.

Note

Similarly, if you need to dump your own model instead of a Model Center model, you can load the trained model through Save and Load Models (S&L) in MegEngine, or obtain the model with the methods in Traced Module, and then replace the net above with it.