megengine.core
"""
Users should never --
import megengine.core
But MegEngine developers should know what is inside.
"""
Warning
We make no promises about the compatibility or stability of the APIs in the core module.
tensor
dtype
- Store metadata for quantize dtype.
- Get quantized dtype with metadata attribute according to _metadata_dict.
- Construct a quantized unsigned int8 data type.
- Construct a quantized int8 data type.
- Construct a quantized int32 data type.
- Construct a quantized unsigned int4 data type.
- Construct a quantized int4 data type.
- Quantize a float NumPy ndarray into a quint8 one with specified params.
- Quantize a float NumPy ndarray into a qint8 one with specified params.
- Dequantize a qint8 NumPy ndarray into a float one.
- Quantize a float NumPy ndarray into a qint32 one with specified params.
- Dequantize a qint32 NumPy ndarray into a float one.
- Quantize a float NumPy ndarray into a quint4 one with specified params.
- Dequantize a quint4 NumPy ndarray into a float one.
- Quantize a float NumPy ndarray into a qint4 one with specified params.
- Dequantize a qint4 NumPy ndarray into a float one.
- Dequantize a quint8 NumPy ndarray into a float one.
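The quantize/dequantize helpers listed above all follow the same affine scheme: a float array is divided by a scale, optionally shifted by a zero point (for the unsigned types), rounded, and clipped to the range of the target integer type; dequantization inverts that mapping. The NumPy sketch below only illustrates that scheme for a quint8-style dtype (scale plus zero point) and a qint8-style dtype (scale only). It is a plain reimplementation of the idea, not a call into the megengine.core.tensor.dtype API, and all function names in it are made up for illustration.

```python
import numpy as np

def quantize_quint8(x, scale, zero_point):
    # quint8-style: scale + zero point, target range [0, 255]
    q = np.round(x / scale) + zero_point
    return np.clip(q, 0, 255).astype(np.uint8)

def dequantize_quint8(q, scale, zero_point):
    # inverse mapping back to float32
    return (q.astype(np.float32) - zero_point) * scale

def quantize_qint8(x, scale):
    # qint8-style: scale only, symmetric target range [-128, 127]
    q = np.round(x / scale)
    return np.clip(q, -128, 127).astype(np.int8)

def dequantize_qint8(q, scale):
    return q.astype(np.float32) * scale

x = np.array([-1.0, 0.0, 0.5, 1.0], dtype=np.float32)
q = quantize_quint8(x, scale=0.01, zero_point=128)
print(q)                                # [ 28 128 178 228]
print(dequantize_quint8(q, 0.01, 128))  # close to x, up to rounding error
```

The int4 variants follow the same pattern with a narrower clipping range; only the target integer range and the presence of a zero point differ between the dtypes.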
indexing
megbrain_graph
- For all oprs in the subgraph constructed by dest_vars, set their priority to id if the original priority is zero.
- Applies optimize_for_inference pass for computing graph.
- C++ graph version of …
- Serialize the computing graph of output_vars and get byte result.
- Load a serialized computing graph from file.
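The last two entries describe a dump/load round trip for a computing graph: serialize the graph reachable from some output vars into bytes, write them out, and later reload the graph from that file. The sketch below shows what that workflow could look like; the module alias and the dump_graph / load_graph names and signatures are assumptions inferred from the summaries above, not a verified reference for the actual API.

```python
# Hypothetical sketch of the serialize / deserialize round trip described above.
# The names `dump_graph` and `load_graph` and their signatures are assumptions
# for illustration; check megengine/core/tensor/megbrain_graph.py for the real API.
from megengine.core.tensor import megbrain_graph as G

def save_graph(output_vars, path):
    # "Serialize the computing graph of output_vars and get byte result."
    data = G.dump_graph(output_vars)   # assumed to return serialized bytes
    if isinstance(data, tuple):        # some versions may also return metadata
        data = data[0]
    with open(path, "wb") as f:
        f.write(data)

def restore_graph(path):
    # "Load a serialized computing graph from file."
    return G.load_graph(path)          # assumed to accept a file path
```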