sub_modules

Implementation of nn.Modules for the temporal fusion transformer.

Classes

AddNorm(input_size[, skip_size, trainable_add])

Residual (skip) connection addition followed by layer normalization.
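
A minimal from-scratch sketch of the add-and-norm pattern (residual addition followed by LayerNorm). Illustrative only, not the library class; the trainable_add option is omitted and the class name is hypothetical.

```python
import torch
import torch.nn as nn

class AddNormSketch(nn.Module):
    """Residual addition followed by layer normalization (illustrative)."""

    def __init__(self, input_size: int):
        super().__init__()
        self.norm = nn.LayerNorm(input_size)

    def forward(self, x: torch.Tensor, skip: torch.Tensor) -> torch.Tensor:
        # Add the skip connection, then normalize over the feature dimension.
        return self.norm(x + skip)

out = AddNormSketch(16)(torch.randn(4, 10, 16), torch.randn(4, 10, 16))
```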

GateAddNorm(input_size[, hidden_size, ...])

Gated linear unit followed by add-and-norm: gating, skip connection, and layer normalization in one block.
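
A sketch of the gate-add-norm composition: GLU gating of the input, residual addition of the skip tensor, then LayerNorm. A minimal sketch under the assumption that input and skip share the same size; not the library implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GateAddNormSketch(nn.Module):
    def __init__(self, input_size: int, dropout: float = 0.1):
        super().__init__()
        self.dropout = nn.Dropout(dropout)
        self.gate = nn.Linear(input_size, input_size * 2)  # GLU halves this again below
        self.norm = nn.LayerNorm(input_size)

    def forward(self, x, skip):
        x = F.glu(self.gate(self.dropout(x)), dim=-1)  # gated linear unit
        return self.norm(x + skip)                     # residual add + layer norm

out = GateAddNormSketch(16)(torch.randn(2, 5, 16), torch.randn(2, 5, 16))
```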

GatedLinearUnit(input_size[, hidden_size, ...])

Gated Linear Unit
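
A sketch of the GLU gating idea: a linear layer producing twice the hidden size, split into a value half and a sigmoid gate half. Shapes and the class name are assumptions for illustration, not the library code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GLUSketch(nn.Module):
    def __init__(self, input_size: int, hidden_size: int, dropout: float = 0.1):
        super().__init__()
        self.dropout = nn.Dropout(dropout)
        self.fc = nn.Linear(input_size, hidden_size * 2)

    def forward(self, x):
        x = self.fc(self.dropout(x))
        # F.glu splits the last dimension in half: value * sigmoid(gate)
        return F.glu(x, dim=-1)

y = GLUSketch(8, 16)(torch.randn(3, 8))  # -> shape (3, 16)
```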

GatedResidualNetwork(input_size, ...[, ...])

Gated residual network (GRN) with optional context and residual connection, as used throughout the temporal fusion transformer.
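
A compact sketch of a gated residual network in the spirit of the TFT paper: two linear layers with ELU, GLU gating, a residual connection (projected when sizes differ), and LayerNorm. The optional context input is omitted; this is illustrative, not the library class.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GRNSketch(nn.Module):
    def __init__(self, input_size, hidden_size, output_size, dropout=0.1):
        super().__init__()
        self.fc1 = nn.Linear(input_size, hidden_size)
        self.fc2 = nn.Linear(hidden_size, hidden_size)
        self.gate = nn.Linear(hidden_size, output_size * 2)   # doubled width for GLU
        self.skip = (nn.Linear(input_size, output_size)
                     if input_size != output_size else nn.Identity())
        self.dropout = nn.Dropout(dropout)
        self.norm = nn.LayerNorm(output_size)

    def forward(self, x):
        h = F.elu(self.fc1(x))
        h = self.dropout(self.fc2(h))
        h = F.glu(self.gate(h), dim=-1)        # gated linear unit
        return self.norm(h + self.skip(x))     # residual add + layer norm

y = GRNSketch(8, 32, 16)(torch.randn(4, 8))    # -> (4, 16)
```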

InterpretableMultiHeadAttention(n_head, d_model)

Interpretable multi-head attention: per-head query and key projections with a shared value projection, so head outputs can be averaged and the attention weights interpreted.
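
A sketch of the "interpretable" multi-head attention variant described in the TFT paper: separate query/key projections per head but a single value projection shared across heads, with head outputs averaged rather than concatenated. Illustrative only; names and shapes are assumptions.

```python
import torch
import torch.nn as nn

class InterpretableMHASketch(nn.Module):
    def __init__(self, n_head: int, d_model: int):
        super().__init__()
        self.n_head, self.d_k = n_head, d_model // n_head
        self.q = nn.ModuleList([nn.Linear(d_model, self.d_k) for _ in range(n_head)])
        self.k = nn.ModuleList([nn.Linear(d_model, self.d_k) for _ in range(n_head)])
        self.v = nn.Linear(d_model, self.d_k)   # value projection shared by all heads
        self.out = nn.Linear(self.d_k, d_model)

    def forward(self, q, k, v):
        v_shared = self.v(v)
        heads = []
        for i in range(self.n_head):
            scores = self.q[i](q) @ self.k[i](k).transpose(-2, -1) / self.d_k ** 0.5
            attn = scores.softmax(dim=-1)
            heads.append(attn @ v_shared)
        # Average the heads instead of concatenating them.
        return self.out(torch.stack(heads).mean(dim=0))

x = torch.randn(2, 7, 32)
y = InterpretableMHASketch(4, 32)(x, x, x)      # -> (2, 7, 32)
```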

PositionalEncoder(d_model[, max_seq_len])

Sinusoidal positional encoding added to the input sequence.
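
A sketch of standard sinusoidal positional encoding (as in "Attention Is All You Need"), which is presumably what PositionalEncoder provides; the exact scaling and buffer handling in the library may differ.

```python
import math
import torch
import torch.nn as nn

class PositionalEncoderSketch(nn.Module):
    def __init__(self, d_model: int, max_seq_len: int = 160):
        super().__init__()
        pe = torch.zeros(max_seq_len, d_model)
        pos = torch.arange(max_seq_len, dtype=torch.float).unsqueeze(1)
        div = torch.exp(torch.arange(0, d_model, 2).float() * (-math.log(10000.0) / d_model))
        pe[:, 0::2] = torch.sin(pos * div)
        pe[:, 1::2] = torch.cos(pos * div)
        self.register_buffer("pe", pe)          # (max_seq_len, d_model), not a parameter

    def forward(self, x):                       # x: (batch, seq_len, d_model)
        return x + self.pe[: x.size(1)]

y = PositionalEncoderSketch(16)(torch.randn(2, 10, 16))
```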

ResampleNorm(input_size[, output_size, ...])

Resamples the input to output_size and applies layer normalization.
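
A simplified sketch of resampling a feature vector to a different size via linear interpolation, followed by LayerNorm. Hypothetical code assuming 2D inputs; the trainable_add weighting of the real class is omitted.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ResampleNormSketch(nn.Module):
    def __init__(self, input_size: int, output_size: int):
        super().__init__()
        self.output_size = output_size
        self.norm = nn.LayerNorm(output_size)

    def forward(self, x):                       # x: (batch, input_size)
        # Interpolate the last (feature) dimension to output_size.
        x = F.interpolate(x.unsqueeze(1), self.output_size, mode="linear", align_corners=True)
        return self.norm(x.squeeze(1))

y = ResampleNormSketch(8, 16)(torch.randn(4, 8))   # -> (4, 16)
```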

ScaledDotProductAttention([dropout, scale])

Scaled dot-product attention with optional dropout and masking.
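
A sketch of scaled dot-product attention, softmax(QKᵀ/√d)V, with an optional mask. This is the standard formulation, not necessarily the library's exact signature.

```python
import torch
import torch.nn as nn

class ScaledDotProductAttentionSketch(nn.Module):
    def __init__(self, dropout: float = 0.0):
        super().__init__()
        self.dropout = nn.Dropout(dropout)

    def forward(self, q, k, v, mask=None):
        scores = q @ k.transpose(-2, -1) / k.size(-1) ** 0.5   # (..., L_q, L_k)
        if mask is not None:
            scores = scores.masked_fill(mask, float("-inf"))
        attn = self.dropout(scores.softmax(dim=-1))
        return attn @ v, attn                                  # output and attention weights

q = k = v = torch.randn(2, 5, 8)
out, attn = ScaledDotProductAttentionSketch()(q, k, v)
```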

TimeDistributed(module[, batch_first])

Applies a module to every time step of a sequence.
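
A sketch of the time-distributed wrapper idea: flatten batch and time into one dimension, apply the wrapped module to every time step at once, then restore the sequence shape. Hypothetical, batch-first only.

```python
import torch
import torch.nn as nn

class TimeDistributedSketch(nn.Module):
    def __init__(self, module: nn.Module):
        super().__init__()
        self.module = module

    def forward(self, x):                        # x: (batch, time, features)
        b, t = x.size(0), x.size(1)
        y = self.module(x.reshape(b * t, -1))    # apply module to every time step
        return y.reshape(b, t, -1)               # restore (batch, time, out_features)

y = TimeDistributedSketch(nn.Linear(8, 4))(torch.randn(2, 10, 8))   # -> (2, 10, 4)
```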

TimeDistributedInterpolation(output_size[, ...])

Time-distributed interpolation of inputs to output_size.
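
A sketch combining the two previous ideas: interpolate the feature dimension of every time step to output_size. Illustrative only; the trainable rescaling option of the real class is omitted.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TimeDistributedInterpolationSketch(nn.Module):
    def __init__(self, output_size: int):
        super().__init__()
        self.output_size = output_size

    def forward(self, x):                        # x: (batch, time, features)
        b, t = x.size(0), x.size(1)
        flat = x.reshape(b * t, 1, -1)           # (batch*time, 1, features) for 1d interpolation
        up = F.interpolate(flat, self.output_size, mode="linear", align_corners=True)
        return up.reshape(b, t, self.output_size)

y = TimeDistributedInterpolationSketch(16)(torch.randn(2, 5, 8))    # -> (2, 5, 16)
```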

VariableSelectionNetwork(input_sizes, ...[, ...])

Calculate weights for num_inputs variables which are each of size input_size.
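
A simplified sketch of variable selection in the TFT spirit: each variable's embedding is transformed to a common hidden size, a softmax over variables produces selection weights, and the output is the weighted sum. The library class uses GatedResidualNetworks and an optional context; plain linear layers stand in here, and every name is hypothetical.

```python
import torch
import torch.nn as nn

class VariableSelectionSketch(nn.Module):
    def __init__(self, input_sizes: dict, hidden_size: int):
        super().__init__()
        self.keys = list(input_sizes)
        # Per-variable transform to a common hidden size (GRNs in the real class).
        self.transform = nn.ModuleDict(
            {name: nn.Linear(size, hidden_size) for name, size in input_sizes.items()}
        )
        # Flattened variables -> one selection weight per variable.
        self.weight_net = nn.Linear(sum(input_sizes.values()), len(input_sizes))

    def forward(self, x: dict):
        flat = torch.cat([x[name] for name in self.keys], dim=-1)
        weights = self.weight_net(flat).softmax(dim=-1)                  # (batch, num_vars)
        transformed = torch.stack(
            [self.transform[name](x[name]) for name in self.keys], dim=-2
        )                                                                # (batch, num_vars, hidden)
        return (weights.unsqueeze(-1) * transformed).sum(dim=-2), weights

inputs = {"price": torch.randn(4, 3), "volume": torch.randn(4, 5)}
out, w = VariableSelectionSketch({"price": 3, "volume": 5}, hidden_size=16)(inputs)
```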