TweedieLoss#

class pytorch_forecasting.metrics.point.TweedieLoss(reduction='mean', p: float = 1.5, **kwargs)[source]#

Bases: MultiHorizonMetric

Tweedie loss.

Tweedie regression with log-link. It can be useful, e.g., for modeling total loss in insurance, or for any other target that might be Tweedie-distributed.

The loss takes the exponential of the network output before it is returned as the prediction. The target normalizer should therefore have no “reverse” transformation; e.g., for the TimeSeriesDataSet initialization, one could use:

import torch
from pytorch_forecasting import TimeSeriesDataSet, EncoderNormalizer

dataset = TimeSeriesDataSet(
    # ... other required dataset arguments (data, time_idx, target, ...) omitted
    target_normalizer=EncoderNormalizer(transformation=dict(forward=torch.log1p)),
)

Note that in this example, the data is log1p-transformed before it is normalized, but it is not re-transformed afterwards. The TweedieLoss applies the “exp” re-transformation to the network output after it has been de-normalized. The result is the model prediction.
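To make this concrete, the following minimal sketch applies the loss to made-up tensors; shapes and values are purely illustrative, and plain tensors of shape (batch size, prediction horizon) are assumed to be accepted directly by loss() and to_prediction():

import torch
from pytorch_forecasting.metrics import TweedieLoss

loss_fn = TweedieLoss(p=1.5)

# Illustrative de-normalized network output, interpreted as log(predicted mean),
# and the corresponding (positive) target values.
y_pred = torch.tensor([[0.0, 0.5, 1.0]])
y_true = torch.tensor([[1.0, 2.0, 3.0]])

point = loss_fn.to_prediction(y_pred)     # exponential of the network output
# point is approximately tensor([[1.0000, 1.6487, 2.7183]])

unreduced = loss_fn.loss(y_pred, y_true)  # elementwise loss, no reduction applied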

Parameters:
  • p (float, optional) – Tweedie variance power, which must be greater than or equal to 1.0 and smaller than 2.0. Values close to 2 shift the loss towards a Gamma distribution, values close to 1 towards a Poisson distribution (see the sketch after this list). Defaults to 1.5.

  • reduction (str, optional) – How to reduce the loss. Defaults to “mean”.
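As a quick, illustrative comparison of the variance power on the same made-up data (only the relative behaviour of the two settings matters here):

import torch
from pytorch_forecasting.metrics import TweedieLoss

# Illustrative de-normalized network output (log scale) and targets.
y_pred = torch.tensor([[0.0, 1.0, 2.0]])
y_true = torch.tensor([[1.0, 2.0, 8.0]])

# Same data, different variance power: closer to Poisson (p near 1)
# versus closer to Gamma (p near 2).
poisson_like = TweedieLoss(p=1.1).loss(y_pred, y_true)
gamma_like = TweedieLoss(p=1.9).loss(y_pred, y_true)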

Methods

  loss(y_pred, y_true)    Calculate loss without reduction.

  to_prediction(out)      Convert network prediction into a point prediction.

loss(y_pred, y_true)[source]#

Calculate loss without reduction. Override in derived classes.

Parameters:
  • y_pred – network output

  • y_true – actual values

Returns:

unreduced loss values used for backpropagation (the configured reduction is applied by the metric afterwards)

Return type:

torch.Tensor
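For orientation, the unreduced term corresponds to the Tweedie deviance under a log-link, dropping terms that do not depend on the prediction. The sketch below spells this out; it is consistent with the documented behaviour but is not necessarily the library's exact implementation:

import torch

def tweedie_loss_sketch(y_true: torch.Tensor, y_pred: torch.Tensor, p: float = 1.5) -> torch.Tensor:
    # y_pred is the (de-normalized) network output, interpreted as log(mean),
    # so the predicted mean is exp(y_pred); terms constant in the prediction are dropped.
    a = y_true * torch.exp(y_pred * (1 - p)) / (1 - p)
    b = torch.exp(y_pred * (2 - p)) / (2 - p)
    return -a + b  # elementwise, i.e. without reduction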

to_prediction(out: Dict[str, Tensor])[source]#

Convert network prediction into a point prediction.

Parameters:

  • out – prediction output of network

Returns:

point prediction

Return type:

torch.Tensor
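In other words, the point prediction is simply the exponential of the (de-normalized) network output. A tiny check, assuming a plain tensor of shape (batch size, prediction horizon) is passed in:

import torch
from pytorch_forecasting.metrics import TweedieLoss

out = torch.tensor([[0.0, 1.0, 2.0]])  # illustrative de-normalized network output
assert torch.allclose(TweedieLoss().to_prediction(out), torch.exp(out))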