- class pytorch_forecasting.data.samplers.TimeSynchronizedBatchSampler(data_source, batch_size: int = 64, shuffle: bool = False, drop_last: bool = False)#
Samples mini-batches randomly but in a time-synchronised manner.
Time-synchronisation means that the time index of the first decoder sample is aligned across the batch. This sampler does not support missing values in the dataset.
data_source (TimeSeriesDataSet) – timeseries dataset.
drop_last (bool) – whether to drop the last mini-batch of a group if it is smaller than batch_size. Defaults to False.
shuffle (bool) – whether to shuffle the dataset. Defaults to False.
batch_size (int, optional) – number of samples in a mini-batch. This is an upper bound: because mini-batches are grouped by prediction time, some groups will yield batches smaller than this maximum. Defaults to 64.
Construct the index of batches from which mini-batches can be sampled.
Create the groups from which samples can be drawn.
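The grouping described above can be illustrated with a minimal, self-contained sketch. This is not the library's implementation: the function `build_synchronized_batches` and the `decoder_start_times` input are hypothetical names chosen for illustration, assuming each sample is identified by the time index at which its decoder starts.

```python
import random
from collections import defaultdict

def build_synchronized_batches(decoder_start_times, batch_size=64,
                               shuffle=False, drop_last=False, seed=None):
    """Sketch of time-synchronised batching (hypothetical helper, not the
    library API): group sample indices by decoder start time, then chunk
    each group into mini-batches of at most ``batch_size`` samples."""
    rng = random.Random(seed)
    # Group sample indices so that every batch shares one decoder start time.
    groups = defaultdict(list)
    for idx, t in enumerate(decoder_start_times):
        groups[t].append(idx)
    batches = []
    for t in sorted(groups):
        indices = groups[t]
        if shuffle:
            rng.shuffle(indices)
        # Chunk the group; the final chunk may be smaller than batch_size.
        for start in range(0, len(indices), batch_size):
            batch = indices[start:start + batch_size]
            if drop_last and len(batch) < batch_size:
                continue  # discard undersized trailing batch per group
            batches.append(batch)
    if shuffle:
        rng.shuffle(batches)
    return batches

# Six samples whose decoders start at times 0, 0, 0, 1, 1, 2.
times = [0, 0, 0, 1, 1, 2]
print(build_synchronized_batches(times, batch_size=2))
# → [[0, 1], [2], [3, 4], [5]] — every batch is time-aligned,
#   and some batches are smaller than batch_size.
```

This also shows why `batch_size` is only an upper bound: a group whose size is not a multiple of `batch_size` produces a smaller final batch, which `drop_last=True` would discard.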