Softmax¶
- class Softmax(axis=None, **kwargs)[source]¶
Applies a softmax function. Softmax is defined as:
\[\text{Softmax}(x_{i}) = \frac{\exp(x_i)}{\sum_j \exp(x_j)}\]
It is applied to all elements along axis, and rescales them so that they lie in the range [0, 1] and sum to 1.
- Parameters
axis – The axis along which softmax will be applied. By default, softmax is applied along the highest-ranked axis.
- Shape:
Input: \((*)\), where \(*\) means any number of additional dimensions
Output: \((*)\), same shape as the input
Examples
>>> import numpy as np
>>> import megengine as mge
>>> import megengine.module as M
>>> data = mge.tensor(np.array([-2, -1, 0, 1, 2]).astype(np.float32))
>>> softmax = M.Softmax()
>>> output = softmax(data)
>>> with np.printoptions(precision=6):
...     print(output.numpy())
[0.011656 0.031685 0.086129 0.234122 0.636409]
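The formula above can also be sketched in plain NumPy for readers who want to check the values by hand. This is an illustrative reference only, not the MegEngine implementation; the function name and the max-subtraction trick (a common numerical-stability technique) are assumptions of this sketch:

```python
import numpy as np

def softmax(x: np.ndarray, axis: int = -1) -> np.ndarray:
    """Numerically stable softmax along `axis` (illustrative NumPy sketch)."""
    # Subtracting the per-axis max does not change the result, since
    # exp(x - c) / sum(exp(x - c)) == exp(x) / sum(exp(x)), but it avoids
    # overflow in exp for large inputs.
    shifted = x - x.max(axis=axis, keepdims=True)
    e = np.exp(shifted)
    return e / e.sum(axis=axis, keepdims=True)

data = np.array([-2, -1, 0, 1, 2], dtype=np.float32)
out = softmax(data)
print(np.round(out, 6))  # same values as the module example above (up to rounding)
print(out.sum())         # sums to 1, up to float rounding
```

On a 2-D array, passing `axis=1` normalizes each row independently, mirroring what the module does along its configured axis.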