Softmax

class Softmax(axis=None, **kwargs)[source]

Applies the Softmax function. Softmax is defined as:

\[\text{Softmax}(x_{i}) = \frac{\exp(x_i)}{\sum_j \exp(x_j)}\]

Applies Softmax to an n-dimensional input tensor, rescaling its elements so that every element of the n-dimensional output tensor lies in the range [0, 1] and all elements sum to 1.
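As a clarifying sketch of the formula above (not MegEngine's actual implementation; softmax_ref is a hypothetical helper name), the computation can be written directly in NumPy. Subtracting the per-axis maximum before exponentiating is a standard numerical-stability trick that does not change the mathematical result:

import numpy as np

def softmax_ref(x, axis=-1):
    # Shift by the max along `axis` for numerical stability (a mathematical no-op).
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    # Normalize so the entries along `axis` sum to 1, matching the formula above.
    return e / e.sum(axis=axis, keepdims=True)

print(softmax_ref(np.array([-2., -1., 0., 1., 2.])))
# Approximately [0.011656 0.031685 0.086129 0.234122 0.636409],
# matching the MegEngine example below.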

Parameters:

axis – The axis along which softmax will be applied. By default, softmax is applied along the highest-ranked axis. Passing an explicit axis is illustrated in the sketch below.
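For instance, with a 2-D input, axis=1 normalizes each row independently. A minimal sketch under the signature shown above (only the sum-to-1 property is checked, since the exact values depend on the input):

>>> import numpy as np
>>> import megengine as mge
>>> import megengine.module as M
>>> x = mge.tensor(np.arange(6, dtype=np.float32).reshape(2, 3))
>>> row_softmax = M.Softmax(axis=1)  # normalize along axis 1 (each row)
>>> np.allclose(row_softmax(x).numpy().sum(axis=1), 1.0)
True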

Examples

>>> import numpy as np
>>> import megengine as mge
>>> import megengine.module as M
>>> data = mge.tensor(np.array([-2,-1,0,1,2]).astype(np.float32))
>>> softmax = M.Softmax()
>>> output = softmax(data)
>>> with np.printoptions(precision=6):
...     print(output.numpy())
[0.011656 0.031685 0.086129 0.234122 0.636409]