vllm.model_executor.layers.fused_moe.activation ¶
MoE activation function enum and utilities.
MoEActivation ¶
Bases: Enum
Activation functions for MoE layers.
Source code in vllm/model_executor/layers/fused_moe/activation.py
custom_op_name property ¶
custom_op_name: str
Maps this activation to its CustomOp name in vllm/model_executor/layers/activation.py.
is_gated property ¶
is_gated: bool
Returns True if the activation expects the gated gate * activation(up) pattern.
Gated activations expect an input tensor with 2x the output size, where the first half is the gate and the second half is the up projection.
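The layout above can be sketched in plain Python. This is a minimal illustration, not vLLM's implementation: it assumes the common SwiGLU-style convention in which the activation is applied to the gate half and multiplied elementwise with the up half, and the helper names (silu, silu_and_mul) are chosen here for illustration.

```python
import math

def silu(x: float) -> float:
    # SiLU (a.k.a. swish): x * sigmoid(x)
    return x / (1.0 + math.exp(-x))

def silu_and_mul(x: list[float]) -> list[float]:
    # Gated pattern: the input has 2x the output size; the first half is
    # the gate, the second half the up projection. The output has half
    # the input's size. (Sketch only; convention assumed, see lead-in.)
    d = len(x) // 2
    gate, up = x[:d], x[d:]
    return [silu(g) * u for g, u in zip(gate, up)]
```

A non-gated (_no_mul) variant would instead apply the activation elementwise, producing an output the same size as its input.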
from_str classmethod ¶
from_str(s: str) -> MoEActivation
Parse from string for backward compatibility.
without_mul ¶
without_mul() -> MoEActivation
Get the non-gated variant of this activation.
For activations that have a _no_mul variant, returns that variant. For activations without a _no_mul variant (or already _no_mul), returns self.
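The from_str and without_mul semantics described above can be sketched with a hypothetical mirror of the enum. The member set below is assumed for illustration; the real MoEActivation in vllm/model_executor/layers/fused_moe/activation.py may differ.

```python
from enum import Enum

class MoEActivationSketch(Enum):
    # Hypothetical stand-in for MoEActivation (illustration only).
    SILU = "silu"
    GELU = "gelu"
    SILU_NO_MUL = "silu_no_mul"
    GELU_NO_MUL = "gelu_no_mul"

    @classmethod
    def from_str(cls, s: str) -> "MoEActivationSketch":
        # Parse from a string, for backward compatibility with
        # string-based activation configs.
        return cls(s.lower())

    def without_mul(self) -> "MoEActivationSketch":
        # Return the _no_mul variant if one exists; activations that are
        # already _no_mul (or have no such variant) return self.
        if self.value.endswith("_no_mul"):
            return self
        try:
            return type(self)(self.value + "_no_mul")
        except ValueError:
            return self
```

For example, MoEActivationSketch.from_str("silu").without_mul() yields the SILU_NO_MUL member, while calling without_mul() on SILU_NO_MUL returns it unchanged.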
activation_without_mul ¶
activation_without_mul(activation: str) -> str
Get the non-gated variant of an activation function.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| activation | str | The activation function name (e.g., "silu", "gelu") | required |

Returns:

| Type | Description |
|---|---|
| str | The non-gated activation name (e.g., "silu_no_mul", "gelu_no_mul") |
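Going by the parameter and return tables, the string-level helper amounts to appending the _no_mul suffix. A minimal sketch (the _sketch name is ours; treating already-suffixed names as a pass-through is an assumption):

```python
def activation_without_mul_sketch(activation: str) -> str:
    # Map a gated activation name to its non-gated "_no_mul" variant,
    # e.g. "silu" -> "silu_no_mul". Names already carrying the suffix
    # pass through unchanged (assumed behavior).
    if activation.endswith("_no_mul"):
        return activation
    return activation + "_no_mul"
```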
apply_moe_activation ¶
apply_moe_activation(
activation: MoEActivation, output: Tensor, input: Tensor
) -> Tensor
Apply the given MoE activation function, reading from input and writing the result to output.
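The gated/non-gated split drives how this dispatch behaves: a gated activation halves the last dimension, a _no_mul variant preserves it. A rough sketch in plain Python (hypothetical helper, string keys instead of the enum, and a returned list instead of the preallocated output tensor):

```python
import math

def silu(x: float) -> float:
    # SiLU: x * sigmoid(x)
    return x / (1.0 + math.exp(-x))

def apply_moe_activation_sketch(activation: str, input: list[float]) -> list[float]:
    # Illustration only: the real apply_moe_activation takes a
    # MoEActivation enum plus output/input tensors.
    if activation == "silu":
        # Gated: first half is the gate, second half the up projection;
        # the output is half the input's size.
        d = len(input) // 2
        return [silu(g) * u for g, u in zip(input[:d], input[d:])]
    if activation == "silu_no_mul":
        # Non-gated: elementwise activation, output matches input size.
        return [silu(x) for x in input]
    raise ValueError(f"unsupported activation: {activation}")
```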