megengine.functional.nn.cross_entropy
- cross_entropy(pred, label, axis=1, with_logits=True, label_smooth=0, reduction='mean')
Computes the multi-class cross entropy loss (using logits by default).

By default (when with_logits is True), pred is assumed to contain logits, and class probabilities are computed from it by softmax. This has better numerical stability than calling softmax and cross_entropy sequentially. When label smoothing is used, the label distribution is as follows:
\[y^{LS}_{k}=y_{k}\left(1-\alpha\right)+\alpha/K\]

where \(y^{LS}\) and \(y\) are the new and original label distributions respectively, \(k\) is the index of the label distribution, \(\alpha\) is label_smooth, and \(K\) is the number of classes.

- Parameters
  - pred – input tensor representing the predicted probabilities (or logits when with_logits is True).
  - label – input tensor representing the classification labels.
  - axis – the axis along which softmax will be applied. Default: 1
  - with_logits – whether to apply softmax to pred first. Default: True
  - label_smooth – the label smoothing factor \(\alpha\); 0 disables smoothing. Default: 0
  - reduction – the reduction to apply to the output: 'none' | 'mean' | 'sum'. Default: 'mean'
- Return type
  Tensor
- Returns
  Loss value.
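To make the label-smoothing formula above concrete, here is a minimal plain-NumPy sketch (an illustration of the formula, not MegEngine's internal implementation; the class count and smoothing factor are made up for the example):

```python
import numpy as np

# Hypothetical example values, not from the MegEngine source.
K = 4            # number of classes
alpha = 0.1      # the label_smooth parameter
y = np.zeros(K, dtype=np.float32)
y[2] = 1.0       # one-hot label for class 2

# y^{LS}_k = y_k * (1 - alpha) + alpha / K
y_ls = y * (1 - alpha) + alpha / K
print(y_ls)  # [0.025 0.025 0.925 0.025] -- still sums to 1
```

The true class keeps most of the probability mass while every class receives a small floor of \(\alpha/K\), which discourages overconfident predictions.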
Examples
import numpy as np
from megengine import tensor
import megengine.functional as F

data_shape = (1, 2)
label_shape = (1, )
pred = tensor(np.array([0, 0], dtype=np.float32).reshape(data_shape))
label = tensor(np.ones(label_shape, dtype=np.int32))
loss = F.nn.cross_entropy(pred, label)
print(loss.numpy().round(decimals=4))
Outputs:
0.6931
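The value 0.6931 can be checked by hand with plain NumPy. The sketch below mirrors what cross_entropy computes for the example above when with_logits is True (a numerically stable log-softmax followed by negative log-likelihood); it is an illustrative re-derivation, not MegEngine's code:

```python
import numpy as np

pred = np.array([[0.0, 0.0]], dtype=np.float32)  # logits from the example
label = np.array([1], dtype=np.int32)

# Numerically stable log-softmax: subtract the row max before exponentiating.
shifted = pred - pred.max(axis=1, keepdims=True)
log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))

# Negative log-likelihood of the true class, averaged (reduction='mean').
loss = -log_probs[np.arange(len(label)), label].mean()
print(round(float(loss), 4))  # 0.6931
```

With both logits equal, softmax assigns each class probability 0.5, so the loss is \(-\ln 0.5 = \ln 2 \approx 0.6931\).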