Adapters

modelgenerator.adapters.MLPAdapter

Bases: Sequential, TokenAdapter

Multi-layer perceptron (MLP) adapter.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `in_features` | `int` | Number of features of the input. | required |
| `out_features` | `int` | Number of features of the output. | required |
| `hidden_sizes` | `List[int]` | List of the hidden feature dimensions. | `[]` |
| `activation_layer` | `Callable[..., Module]` | Activation function. | `torch.nn.Tanh` |
| `bias` | `bool` | Whether to use bias in the linear layers. | `True` |
| `dropout` | `float` | The probability for the dropout layers. | `0.0` |
| `dropout_in_middle` | `bool` | Whether to use dropout in the middle layers. | `True` |
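
A minimal usage sketch, assuming the class is importable from `modelgenerator.adapters` and that, as an `nn.Sequential`, it maps tensors whose last dimension is `in_features` to `out_features`:

```python
import torch
from modelgenerator.adapters import MLPAdapter

# Two-layer MLP head: 768 -> 256 -> 2, Tanh activations, 10% dropout.
adapter = MLPAdapter(
    in_features=768,
    out_features=2,
    hidden_sizes=[256],
    activation_layer=torch.nn.Tanh,
    dropout=0.1,
)

# As an nn.Sequential of linear layers, it applies to the last dimension
# of any input, e.g. per-token embeddings of shape (batch, seq_len, 768).
embeddings = torch.randn(4, 128, 768)
logits = adapter(embeddings)  # -> (4, 128, 2)
```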

modelgenerator.adapters.LinearAdapter

Bases: MLPAdapter

Simple linear adapter for a 1D embedding.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `in_features` | `int` | Number of input features. | required |
| `out_features` | `int` | Number of output features. | required |
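
A minimal sketch, assuming that, as an `MLPAdapter` with no hidden layers, it reduces to a single linear projection over the last dimension:

```python
import torch
from modelgenerator.adapters import LinearAdapter

adapter = LinearAdapter(in_features=768, out_features=10)

# A single linear projection over the embedding dimension.
embedding = torch.randn(4, 768)  # (batch, in_features)
out = adapter(embedding)         # -> (4, 10)
```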

modelgenerator.adapters.LinearCLSAdapter

Bases: Module, SequenceAdapter

Simple linear adapter for a 1D embedding.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `in_features` | `int` | Number of input features. | required |
| `out_features` | `int` | Number of output features. | required |
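
A minimal sketch, assuming the usual `SequenceAdapter` call convention in which the adapter receives per-token hidden states of shape `(batch, seq_len, in_features)` (possibly together with an attention mask) and returns one projected vector per sequence, taken from the first ([CLS]) token:

```python
import torch
from modelgenerator.adapters import LinearCLSAdapter

adapter = LinearCLSAdapter(in_features=768, out_features=2)

# Per-token hidden states from a backbone.
hidden_states = torch.randn(4, 128, 768)  # (batch, seq_len, in_features)

# Assumed call convention: the adapter pools the [CLS] (first) token and
# applies a linear projection; the forward pass may also accept an
# attention mask as an additional argument.
logits = adapter(hidden_states)  # expected shape: (4, 2)
```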

modelgenerator.adapters.LinearTransformerAdapter

Bases: Module, SequenceAdapter

Transformer adapter.

Note: Supports cls_pooling only.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `embed_dim` | `int` | Hidden size. | required |
| `out_features` | `int` | Number of output features. | required |
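
A minimal sketch, assuming the adapter takes per-token embeddings of width `embed_dim`, applies its transformer layers with cls_pooling (as noted above), and projects the pooled vector to `out_features`; the exact forward signature is an assumption:

```python
import torch
from modelgenerator.adapters import LinearTransformerAdapter

adapter = LinearTransformerAdapter(embed_dim=768, out_features=2)

# Token embeddings from a backbone: (batch, seq_len, embed_dim).
hidden_states = torch.randn(4, 128, 768)

# Assumed behavior: transformer layers over the tokens, CLS pooling,
# then a linear projection to out_features.
logits = adapter(hidden_states)  # expected shape: (4, 2)
```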

modelgenerator.adapters.ConditionalLMAdapter

Bases: Module, ConditionalGenerationAdapter

Conditional sequence adapter.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `in_features` | `int` | Number of input features. | required |
| `embed_dim` | `int` | Hidden size. | required |
| `seq_len` | `int` | Sequence length. | required |
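
A construction-only sketch using the documented constructor arguments; the forward signature for conditional generation (e.g. how token embeddings and conditioning inputs are passed) is not specified here, so no call is shown:

```python
from modelgenerator.adapters import ConditionalLMAdapter

# The specific sizes below are illustrative, not defaults.
adapter = ConditionalLMAdapter(
    in_features=768,  # number of input features
    embed_dim=512,    # hidden size
    seq_len=1024,     # sequence length
)
```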