megengine.functional.nn.binary_cross_entropy
- binary_cross_entropy(pred, label, with_logits=True, reduction='mean')
Computes the binary cross entropy loss (using logits by default).
- Parameters
  - pred – predicted values; interpreted as logits by default, or as probabilities when with_logits is False.
  - label – ground-truth labels, same shape as pred.
  - with_logits – whether pred is given as logits. Default: True.
  - reduction – reduction applied to the output: "mean" or "none". Default: "mean".
- Return type
  Tensor
- Returns
  loss value.
Examples
By default (with_logits is True), pred is assumed to be logits, and class probabilities are given by sigmoid. This has better numerical stability than sequential calls to sigmoid and binary_cross_entropy.

>>> pred = Tensor([0.9, 0.7, 0.3])
>>> label = Tensor([1., 1., 1.])
>>> F.nn.binary_cross_entropy(pred, label)
Tensor(0.4328984, device=xpux:0)
>>> F.nn.binary_cross_entropy(pred, label, reduction="none")
Tensor([0.3412 0.4032 0.5544], device=xpux:0)
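The logits path can be reproduced with plain NumPy using the numerically stable rewrite max(x, 0) - x*y + log(1 + exp(-|x|)) of -y*log(sigmoid(x)) - (1-y)*log(1-sigmoid(x)). This is an illustrative sketch, not MegEngine's actual kernel; NumPy arrays stand in for Tensors:

```python
import numpy as np

def bce_with_logits(pred, label):
    """Binary cross entropy on logits, numerically stable.

    Algebraically equal to -label*log(sigmoid(pred))
    - (1-label)*log(1-sigmoid(pred)), but exp() only ever sees
    a non-positive argument, so it cannot overflow for large |pred|.
    """
    return np.maximum(pred, 0) - pred * label + np.log1p(np.exp(-np.abs(pred)))

pred = np.array([0.9, 0.7, 0.3])
label = np.array([1.0, 1.0, 1.0])

per_element = bce_with_logits(pred, label)  # matches reduction="none"
mean_loss = per_element.mean()              # matches the default "mean" reduction
```

Here per_element is approximately [0.3412, 0.4032, 0.5544] and mean_loss is approximately 0.4329, matching the doctest values above up to float32 rounding.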
If pred already contains probabilities, set with_logits to False:

>>> pred = Tensor([0.9, 0.7, 0.3])
>>> label = Tensor([1., 1., 1.])
>>> F.nn.binary_cross_entropy(pred, label, with_logits=False)
Tensor(0.5553361, device=xpux:0)
>>> F.nn.binary_cross_entropy(pred, label, with_logits=False, reduction="none")
Tensor([0.1054 0.3567 1.204 ], device=xpux:0)
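With with_logits=False the computation reduces to the textbook formula -(y*log(p) + (1-y)*log(1-p)). A NumPy sketch of that path (again an illustration under the same stand-in assumptions, not MegEngine's implementation):

```python
import numpy as np

def bce_probs(pred, label):
    # Plain binary cross entropy on probabilities in (0, 1).
    # Unstable if pred reaches exactly 0 or 1 (log(0) diverges),
    # which is one reason the logits form is the default.
    return -(label * np.log(pred) + (1 - label) * np.log(1 - pred))

pred = np.array([0.9, 0.7, 0.3])
label = np.array([1.0, 1.0, 1.0])

per_element = bce_probs(pred, label)  # matches reduction="none"
mean_loss = per_element.mean()        # default "mean" reduction
```

Since every label here is 1, each element is simply -log(pred_i): per_element is approximately [0.1054, 0.3567, 1.204] and mean_loss approximately 0.5553361, matching the doctest above.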