megengine.functional.nn.binary_cross_entropy

binary_cross_entropy(pred, label, with_logits=True, reduction='mean')[source]

Computes the binary cross entropy loss (using logits by default).

By default (with_logits is True), pred is assumed to contain logits; class probabilities are obtained by applying sigmoid.
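As a rough reference, the default behavior can be reproduced with a few lines of NumPy. This is a minimal sketch of the underlying formula (sigmoid followed by element-wise binary cross entropy with a mean reduction), not MegEngine's actual implementation; the helper name bce_with_logits is illustrative only.

import numpy as np

def bce_with_logits(pred, label):
    # sigmoid turns logits into probabilities
    prob = 1.0 / (1.0 + np.exp(-pred))
    # element-wise binary cross entropy
    loss = -(label * np.log(prob) + (1 - label) * np.log(1 - prob))
    # 'mean' reduction
    return loss.mean()

print(bce_with_logits(np.zeros((1, 2)), np.ones((1, 2))))  # ~0.6931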

Parameters
  • pred (Tensor) – (N, *), where * means any number of additional dimensions.

  • label (Tensor) – (N, *), same shape as the input.

  • with_logits (bool) – whether to apply sigmoid to pred first. Default: True

  • reduction (str) – the reduction to apply to the output: ‘none’ | ‘mean’ | ‘sum’. Default: ‘mean’

Return type

Tensor

Returns

Loss value.

Examples

import numpy as np
from megengine import tensor
import megengine.functional as F

pred = tensor(np.array([0, 0], dtype=np.float32).reshape(1, 2))
label = tensor(np.ones((1, 2), dtype=np.float32))
loss = F.nn.binary_cross_entropy(pred, label)
print(loss.numpy().round(decimals=4))

Outputs:

0.6931
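If probabilities have already been computed (for example by an explicit sigmoid), with_logits=False skips the internal sigmoid. A small usage sketch, assuming the same pred and label as above and that F.sigmoid is used to produce the probabilities:

prob = F.sigmoid(pred)
loss = F.nn.binary_cross_entropy(prob, label, with_logits=False)
print(loss.numpy().round(decimals=4))  # same value: 0.6931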