Metrics
Multiple metrics have been implemented to ease adaptation. In particular, these metrics can be applied to the multihorizon forecasting problem, i.e. they accept tensors that are not only of shape n_samples, but also n_samples x prediction_horizon or even n_samples x prediction_horizon x n_outputs, where n_outputs could be the number of forecasted quantiles.
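To make the shape handling concrete, here is a minimal NumPy sketch (an illustration only, not pytorch-forecasting's implementation; `smape` is a hypothetical helper) of a metric that accepts any of these shapes by averaging over all dimensions:

```python
import numpy as np

def smape(y_pred, y_true):
    # Symmetric mean absolute percentage error; the mean runs over all
    # dimensions, so 1D, 2D, and 3D inputs are handled uniformly.
    return np.mean(
        2 * np.abs(y_pred - y_true) / (np.abs(y_pred) + np.abs(y_true))
    )

# shape n_samples
smape(np.array([1.0, 2.0]), np.array([2.0, 4.0]))

# shape n_samples x prediction_horizon
smape(np.ones((8, 6)), np.full((8, 6), 2.0))

# shape n_samples x prediction_horizon x n_outputs, e.g. 7 quantiles
smape(np.ones((8, 6, 7)), np.full((8, 6, 7), 2.0))
```

All three calls reduce to a single scalar loss, which is what makes the same metric reusable across point forecasts, multihorizon forecasts, and quantile forecasts.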
Metrics can be easily combined by addition, e.g.
from pytorch_forecasting.metrics import SMAPE, MAE
composite_metric = SMAPE() + 1e-4 * MAE()
Such composite metrics are useful for training because one metric can be optimized while keeping outliers in another metric in check. In this example, SMAPE is mostly optimized, while large outliers in MAE are avoided.
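In plain NumPy terms (a sketch of the idea, not the library's implementation), the composite loss is simply the weighted sum of its components:

```python
import numpy as np

def mae(y_pred, y_true):
    # mean absolute error
    return np.mean(np.abs(y_pred - y_true))

def smape(y_pred, y_true):
    # symmetric mean absolute percentage error
    return np.mean(
        2 * np.abs(y_pred - y_true) / (np.abs(y_pred) + np.abs(y_true))
    )

def composite(y_pred, y_true, mae_weight=1e-4):
    # Weighted sum, as produced by metric addition: with a small weight,
    # SMAPE dominates the loss except when large outliers inflate the
    # MAE term enough to matter.
    return smape(y_pred, y_true) + mae_weight * mae(y_pred, y_true)
```

Because SMAPE is bounded per sample while MAE is not, the MAE term contributes little for well-behaved predictions but grows without bound for large absolute errors, which is the outlier-avoidance effect described above.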
Further, one can modify a loss metric to reduce a mean prediction bias, i.e. to ensure that predictions add up to the actuals on average. For example:
from pytorch_forecasting.metrics import MAE, AggregationMetric
composite_metric = MAE() + AggregationMetric(metric=MAE())
Here we add an additional loss to MAE: the MAE calculated on the mean predictions and mean actuals. Other metrics such as SMAPE can be used in the same way to ensure that aggregated results are unbiased in that metric. One important point to keep in mind is that this metric is calculated across samples, i.e. it will vary depending on the batch size. In particular, errors tend to average out as the batch size increases.
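The difference between the per-sample loss and the aggregation term can be sketched in plain NumPy (an illustration of the idea, not the library's implementation):

```python
import numpy as np

def mae(y_pred, y_true):
    # ordinary per-sample mean absolute error
    return np.mean(np.abs(y_pred - y_true))

def aggregation_mae(y_pred, y_true):
    # MAE between the mean prediction and the mean actual across the batch
    return np.abs(np.mean(y_pred) - np.mean(y_true))

y_true = np.zeros(8)
# per-sample errors of +/-1 that cancel exactly in the batch mean
y_pred = np.array([1.0, -1.0] * 4)

mae(y_pred, y_true)              # 1.0: every sample is off by one
aggregation_mae(y_pred, y_true)  # 0.0: predictions are unbiased on average
```

Because the aggregation term only sees the batch means, offsetting errors cancel; with larger batches more errors average out, which is why the value of this term depends on the batch size.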
Details
See the API documentation for further details on available metrics:

DistributionLoss: Base class for distribution losses.
Metric: Base metric class with basic functions that can handle predicting quantiles and operate in log space.
MultiHorizonMetric: Abstract class for defining a metric for a multihorizon forecast.
MultiLoss: Metric that can be used with multiple metrics.
MultivariateDistributionLoss: Base class for multivariate distribution losses.
convert_torchmetric_to_pytorch_forecasting_metric: If necessary, converts a torchmetric to a PyTorch Forecasting metric that works with PyTorch Forecasting models.

BetaDistributionLoss: Beta distribution loss for unit interval data.
ImplicitQuantileNetworkDistributionLoss: Implicit quantile network distribution loss.
LogNormalDistributionLoss: Log-normal distribution loss.
MQF2DistributionLoss: Multivariate quantile loss based on the article "Multivariate Quantile Function Forecaster".
MultivariateNormalDistributionLoss: Multivariate low-rank normal distribution loss.
NegativeBinomialDistributionLoss: Negative binomial loss, e.g. for count data.
NormalDistributionLoss: Normal distribution loss.

CrossEntropy: Cross entropy loss for classification.
MAE: Mean absolute error.
MAPE: Mean absolute percentage error.
MASE: Mean absolute scaled error.
PoissonLoss: Poisson loss for count data.
RMSE: Root mean square error.
SMAPE: Symmetric mean absolute percentage error.
TweedieLoss: Tweedie loss.
QuantileLoss: Quantile loss.