TimeSynchronizedBatchSampler

class pytorch_forecasting.data.timeseries.TimeSynchronizedBatchSampler(data_source: pytorch_forecasting.data.timeseries.TimeSeriesDataSet, batch_size: int = 64, shuffle: bool = False, drop_last: bool = False)[source]

Bases: torch.utils.data.sampler.Sampler

Samples mini-batches randomly but in a time-synchronised manner.

Time-synchronisation means that the time index of the first decoder sample is aligned across the batch. This sampler does not support missing values in the dataset.

Initialize TimeSynchronizedBatchSampler.

Parameters
  • data_source (TimeSeriesDataSet) – timeseries dataset.

  • drop_last (bool) – if to drop last mini-batch from a group if it is smaller than batch_size. Defaults to False.

  • shuffle (bool) – if to shuffle dataset. Defaults to False.

  • batch_size (int, optional) – Number of samples in a mini-batch. This is an upper bound: because mini-batches are grouped by prediction time, some groups will yield batches smaller than the maximum. Defaults to 64.
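
A minimal usage sketch (illustrative, not taken from the library docs): the toy DataFrame, column names, and encoder/prediction lengths below are assumptions; TimeSeriesDataSet.to_dataloader accepts either a sampler instance or the string shorthand "synchronized" for this sampler.

    import numpy as np
    import pandas as pd

    from pytorch_forecasting.data import TimeSeriesDataSet
    from pytorch_forecasting.data.timeseries import TimeSynchronizedBatchSampler

    # toy panel: 3 series of 20 consecutive time steps each, no missing values
    data = pd.DataFrame(
        {
            "series": np.repeat(["a", "b", "c"], 20),
            "time_idx": np.tile(np.arange(20), 3),
            "value": np.random.randn(60),
        }
    )

    dataset = TimeSeriesDataSet(
        data,
        time_idx="time_idx",
        target="value",
        group_ids=["series"],
        max_encoder_length=5,
        max_prediction_length=2,
        time_varying_unknown_reals=["value"],
    )

    # option 1: construct the sampler explicitly and hand it to the dataloader
    sampler = TimeSynchronizedBatchSampler(dataset, batch_size=64, shuffle=True)
    dataloader = dataset.to_dataloader(train=True, batch_sampler=sampler)

    # option 2: let to_dataloader build the sampler from the string shorthand
    dataloader = dataset.to_dataloader(train=True, batch_sampler="synchronized")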

Methods

construct_batch_groups()

Construct an index of batch groups from which mini-batches can be sampled

construct_batch_groups()[source]

Construct an index of batch groups from which mini-batches can be sampled
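
The resulting grouping can be inspected by iterating the sampler, which yields one list of dataset sample indices per mini-batch; a brief sketch, reusing the illustrative sampler and dataset from the example above:

    # each yielded element is one mini-batch of dataset indices; within a
    # batch, the first decoder time index is aligned across all samples
    for batch_indices in sampler:
        batch = [dataset[i] for i in batch_indices]
        break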