lgatr.layers.mlp.config.MLPConfig
- class lgatr.layers.mlp.config.MLPConfig(mv_channels=None, s_channels=None, dropout_prob=None, activation='gelu', increase_hidden_channels=2, num_hidden_layers=1)
Bases: object
Geometric MLP configuration.
- Parameters:
activation ({"relu", "sigmoid", "gelu"}) – Which (gated) activation function to use.
increase_hidden_channels (int) – Factor by which to increase the number of hidden channels (both multivectors and scalars). Vanilla transformers use 4; we use 2 for backward compatibility.
num_hidden_layers (int) – Number of hidden layers to create.
- Parameters auto-set by LGATr:
mv_channels (int) – Number of input multivector channels.
s_channels (int) – Number of input scalar channels.
dropout_prob (float or None) – Dropout probability.
- activation: str = 'gelu'
- dropout_prob: Optional[float] = None
- mv_channels: Optional[int] = None
- s_channels: Optional[int] = None
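A minimal usage sketch, assuming lgatr is installed and MLPConfig is importable from lgatr.layers.mlp.config as documented above. The channel and dropout values shown are purely illustrative, since LGATr normally sets those fields automatically.

```python
from lgatr.layers.mlp.config import MLPConfig

# Typical user-facing configuration: only the hyperparameters are set here;
# mv_channels, s_channels, and dropout_prob are auto-set by LGATr.
config = MLPConfig(
    activation="gelu",           # one of {"relu", "sigmoid", "gelu"}
    increase_hidden_channels=2,  # hidden-width factor (vanilla transformers use 4)
    num_hidden_layers=1,         # number of hidden layers
)

# Illustrative only: these values would normally be filled in by LGATr.
config.mv_channels = 16
config.s_channels = 32
config.dropout_prob = 0.1
```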