megengine.functional.nn.binary_cross_entropy
- binary_cross_entropy(pred, label, with_logits=True, reduction='mean')[source]
Computes the binary cross entropy loss (using logits by default).
By default (with_logits is True), pred is assumed to be logits, and class probabilities are obtained by applying sigmoid (see the formula sketched below).
- Parameters
  - pred – predicted logits, or probabilities when with_logits is False.
  - label – ground-truth labels, with the same shape as pred.
  - with_logits – whether pred is treated as logits and passed through sigmoid internally. Default: True.
  - reduction – reduction mode applied to the elementwise losses. Default: 'mean'.
- Return type
  Tensor
- Returns
  loss value.
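For reference, a standard formulation of the per-element loss when with_logits is True is sketched below (the notation is ours, not from the original docs); under the default 'mean' reduction these values are averaged over all elements:

$$
\ell(x, y) = -\bigl[\, y \log \sigma(x) + (1 - y)\,\log\bigl(1 - \sigma(x)\bigr) \bigr],
\qquad \sigma(x) = \frac{1}{1 + e^{-x}}
$$

Plugging x = 0 and y = 1 into this expression gives -log(0.5) ≈ 0.6931, which matches the example output below.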
Examples
import numpy as np
from megengine import tensor
import megengine.functional as F

pred = tensor(np.array([0, 0], dtype=np.float32).reshape(1, 2))
label = tensor(np.ones((1, 2), dtype=np.float32))
loss = F.nn.binary_cross_entropy(pred, label)
print(loss.numpy().round(decimals=4))
Outputs:
0.6931
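As a brief sketch of the with_logits flag (this follow-up example is an addition, not from the original docs): passing probabilities that have already gone through sigmoid, together with with_logits=False, should yield the same loss as passing the raw logits with the default setting.

import numpy as np
from megengine import tensor
import megengine.functional as F

# Raw logits of zero correspond to probability 0.5 after sigmoid.
logits = tensor(np.zeros((1, 2), dtype=np.float32))
label = tensor(np.ones((1, 2), dtype=np.float32))

# Default path: pred is treated as logits, sigmoid is applied internally.
loss_from_logits = F.nn.binary_cross_entropy(logits, label)

# Equivalent path: apply sigmoid manually and disable the internal sigmoid.
probs = F.sigmoid(logits)
loss_from_probs = F.nn.binary_cross_entropy(probs, label, with_logits=False)

print(loss_from_logits.numpy().round(decimals=4))  # 0.6931
print(loss_from_probs.numpy().round(decimals=4))   # 0.6931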