_identity

pytorch_forecasting.data.encoders._identity(x)

A private helper in the encoders module. As the name suggests, it appears to act as a no-op transformation, returning its input unchanged; such helpers are typically used as the default entry in a table of transformation functions (alongside counterparts like _clamp_zero and _plus_one).