ImplicitQuantileNetworkDistributionLoss#
- class pytorch_forecasting.metrics.distributions.ImplicitQuantileNetworkDistributionLoss(quantiles: List[float] = [0.02, 0.1, 0.25, 0.5, 0.75, 0.9, 0.98], input_size: int | None = 16, hidden_size: int | None = 32, n_loss_samples: int | None = 64)[source]#
Bases:
DistributionLoss
Implicit Quantile Network Distribution Loss.
Based on Probabilistic Time Series Forecasting with Implicit Quantile Networks. A network is used to directly map network outputs to a quantile.
- Parameters:
quantiles (List[float], optional) – default quantiles to output. Defaults to [0.02, 0.1, 0.25, 0.5, 0.75, 0.9, 0.98].
input_size (int, optional) – input size per prediction length. Defaults to 16.
hidden_size (int, optional) – hidden size per prediction length. Defaults to 32.
n_loss_samples (int, optional) – number of quantiles to sample to calculate loss. Defaults to 64.
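The core idea of an implicit quantile network can be sketched as follows in NumPy. This is an illustration of the technique from the referenced paper, not the library's implementation: a quantile level tau drawn from U(0, 1) is embedded with cosine features, combined multiplicatively with the network output, and projected to the value at that quantile. All weights and shapes below are made up for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

input_size, hidden_size = 16, 32

# Random quantile level tau in (0, 1).
tau = rng.uniform()

# Cosine embedding of tau: cos(pi * i * tau) for i = 0 .. input_size - 1.
i = np.arange(input_size)
tau_embedding = np.cos(np.pi * i * tau)  # shape (input_size,)

# Hypothetical weights; in the real network these are learned.
W_tau = rng.normal(size=(input_size, hidden_size)) / np.sqrt(input_size)
W_out = rng.normal(size=(hidden_size,)) / np.sqrt(hidden_size)

# Network output for one time step (would come from the forecasting model).
x = rng.normal(size=(hidden_size,))

# Multiplicative interaction between the embedded quantile level and the
# network output (as in the IQN paper), then a projection to a scalar:
# the predicted value at quantile level tau.
h = x * np.maximum(tau_embedding @ W_tau, 0.0)  # ReLU on the embedding
quantile_value = h @ W_out
```

Evaluating this mapping at many values of tau traces out the predictive distribution, which is why no fixed set of output quantiles has to be chosen up front.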
Methods

loss(y_pred, y_actual): Calculate negative likelihood.
rescale_parameters(parameters, target_scale, ...): Rescale normalized parameters into the scale required for the output.
sample(y_pred, n_samples): Sample from distribution.
to_prediction(y_pred[, n_samples]): Convert network prediction into a point prediction.
to_quantiles(y_pred[, quantiles]): Convert network prediction into a quantile prediction.
- loss(y_pred: Tensor, y_actual: Tensor) Tensor [source]#
Calculate negative likelihood.
- Parameters:
y_pred – network output
y_actual – actual values
- Returns:
metric value on which backpropagation can be applied
- Return type:
torch.Tensor
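At its core, this loss is the quantile (pinball) loss averaged over n_loss_samples randomly drawn quantile levels. A NumPy sketch of that computation, with made-up predictions standing in for the network's output:

```python
import numpy as np

rng = np.random.default_rng(42)

n_loss_samples = 64
y_actual = np.array([1.0, 2.0, 3.0])     # actual values
taus = rng.uniform(size=n_loss_samples)  # sampled quantile levels

# Hypothetical predicted values at each sampled quantile level,
# shape (n_loss_samples, n_observations); in the real loss these
# come from evaluating the implicit quantile network at each tau.
y_pred = y_actual[None, :] + rng.normal(scale=0.5, size=(n_loss_samples, 3))

# Pinball loss: tau * max(err, 0) + (1 - tau) * max(-err, 0),
# averaged over quantile levels and observations.
err = y_actual[None, :] - y_pred
loss = np.mean(taus[:, None] * np.maximum(err, 0.0)
               + (1.0 - taus[:, None]) * np.maximum(-err, 0.0))
```

Minimizing this expectation over uniformly sampled tau trains the network to match the entire quantile function, not just a fixed list of quantiles.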
- rescale_parameters(parameters: Tensor, target_scale: Tensor, encoder: BaseEstimator) Tensor [source]#
Rescale normalized parameters into the scale required for the output.
- Parameters:
parameters (torch.Tensor) – normalized parameters (indexed by last dimension)
target_scale (torch.Tensor) – scale of parameters (n_batch_samples x (center, scale))
encoder (BaseEstimator) – original encoder that normalized the target in the first place
- Returns:
parameters in real/not normalized space
- Return type:
torch.Tensor
- sample(y_pred, n_samples: int) Tensor [source]#
Sample from distribution.
- Parameters:
y_pred – prediction output of network (shape batch_size x n_timesteps x n_parameters)
n_samples (int) – number of samples to draw
- Returns:
tensor with samples (shape batch_size x n_timesteps x n_samples)
- Return type:
torch.Tensor
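Sampling from an implicit quantile model amounts to evaluating the learned quantile function at uniformly drawn quantile levels (inverse-CDF sampling). A sketch with a closed-form quantile function standing in for the network:

```python
import numpy as np

rng = np.random.default_rng(7)

def quantile_fn(tau, mu=0.0, scale=1.0):
    """Stand-in for the learned quantile function: the inverse CDF of a
    logistic distribution, mu + scale * log(tau / (1 - tau))."""
    return mu + scale * np.log(tau / (1.0 - tau))

# Draw quantile levels uniformly and evaluate the quantile function at
# each one; the results are samples from the modeled distribution.
n_samples = 10_000
taus = rng.uniform(size=n_samples)
samples = quantile_fn(taus)  # mean close to mu for large n_samples
```

In the library, the quantile function is the network itself, evaluated per batch element and time step, which yields the documented output shape batch_size x n_timesteps x n_samples.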
- to_prediction(y_pred: Tensor, n_samples: int = 100) Tensor [source]#
Convert network prediction into a point prediction.
- Parameters:
y_pred – prediction output of network
n_samples (int, optional) – number of samples to draw for the prediction. Defaults to 100.
- Returns:
mean prediction
- Return type:
torch.Tensor
- to_quantiles(y_pred: Tensor, quantiles: List[float] | None = None) Tensor [source]#
Convert network prediction into a quantile prediction.
- Parameters:
y_pred – prediction output of network
quantiles (List[float], optional) – quantiles for probability range. Defaults to quantiles as defined in the class initialization.
- Returns:
prediction quantiles (last dimension)
- Return type:
torch.Tensor
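Since the distribution is only defined implicitly, quantile predictions are obtained empirically from drawn samples, with the quantiles placed in the last dimension. A NumPy sketch under assumed shapes (random samples stand in for the output of sample()):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical samples, shape (batch_size, n_timesteps, n_samples),
# as produced by sample().
samples = rng.normal(size=(4, 6, 500))

quantiles = [0.02, 0.1, 0.25, 0.5, 0.75, 0.9, 0.98]

# Empirical quantiles over the sample dimension. np.quantile puts the
# quantile axis first, so move it to the end to get the documented
# layout (batch_size, n_timesteps, n_quantiles).
pred_quantiles = np.moveaxis(np.quantile(samples, quantiles, axis=-1), 0, -1)

print(pred_quantiles.shape)  # (4, 6, 7)
```

The quantile values along the last dimension are monotonically non-decreasing by construction, which is the property downstream plotting and interval code relies on.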