megengine.functional.nn.square_loss

square_loss(pred, label, reduction='mean')

Calculates the mean squared error (squared L2 norm) between each element of pred \(x\) and label \(y\).

The mean squared error can be described as:

\[\ell(x, y) = \operatorname{mean}(L)\]

where

\[L = \{l_1,\dots,l_N\}, \quad l_n = \left( x_n - y_n \right)^2,\]

\(x\) and \(y\) are tensors of arbitrary shapes with a total of \(N\) elements each. \(N\) is the batch size.
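
As an informal sketch (not part of the documented API), the formula above can be reproduced with element-wise tensor operations and the generic reductions F.mean and F.sum; the values reuse the Examples below:

>>> import megengine.functional as F
>>> from megengine import Tensor
>>> x = Tensor([3.0, 3.0, 3.0, 3.0])  # pred
>>> y = Tensor([2.0, 8.0, 6.0, 1.0])  # label
>>> L = (x - y) ** 2                  # l_n = (x_n - y_n)^2, i.e. reduction='none'
>>> F.mean(L)                         # reduction='mean'
Tensor(9.75, device=xpux:0)
>>> F.sum(L)                          # reduction='sum'
Tensor(39.0, device=xpux:0)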

Parameters
  • pred (Tensor) – predicted result from the model.

  • label (Tensor) – ground truth to compare.

  • reduction (str) – the reduction to apply to the output: ‘none’ | ‘mean’ | ‘sum’. Default: ‘mean’.

Return type

Tensor

Returns

loss value.

Shape:
  • pred: \((N, *)\) where \(*\) means any number of additional dimensions.

  • label: \((N, *)\). Same shape as pred.

Examples

>>> pred = Tensor([3, 3, 3, 3])
>>> label = Tensor([2, 8, 6, 1])
>>> F.nn.square_loss(pred, label)
Tensor(9.75, device=xpux:0)
>>> F.nn.square_loss(pred, label, reduction="none")
Tensor([ 1. 25.  9.  4.], device=xpux:0)
>>> F.nn.square_loss(pred, label, reduction="sum")
Tensor(39.0, device=xpux:0)
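
The inputs are not restricted to 1-D; as a further sketch with arbitrarily chosen values, the reduction runs over all \(N\) elements of a multi-dimensional input:

>>> pred = Tensor([[1.0, 1.0], [1.0, 1.0]])
>>> label = Tensor([[0.0, 2.0], [3.0, 1.0]])
>>> F.nn.square_loss(pred, label)       # mean of the squared differences: (1 + 1 + 4 + 0) / 4
Tensor(1.5, device=xpux:0)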