Activation functions
Customized activation functions for supporting various models in 🤗 Diffusers.
GELU[[diffusers.models.activations.GELU]]
diffusers.models.activations.GELU[[diffusers.models.activations.GELU]]
GELU activation function, with optional tanh approximation when approximate="tanh".
Parameters:
dim_in (int): The number of channels in the input.
dim_out (int): The number of channels in the output.
approximate (str, optional, defaults to "none"): If "tanh", use the tanh approximation.
bias (bool, defaults to True): Whether to use a bias in the linear layer.
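As a rough illustration of what approximate="tanh" changes, here is a pure-Python sketch of the two GELU formulas (the actual class is a torch module that also applies a linear projection; this only shows the elementwise math):

```python
import math

def gelu_exact(x: float) -> float:
    # Exact GELU: x * Phi(x), where Phi is the standard normal CDF.
    return 0.5 * x * (1.0 + math.erf(x / math.sqrt(2.0)))

def gelu_tanh(x: float) -> float:
    # Tanh approximation of GELU, as selected by approximate="tanh".
    return 0.5 * x * (1.0 + math.tanh(math.sqrt(2.0 / math.pi) * (x + 0.044715 * x**3)))
```

The two agree closely for typical activations; the tanh form exists mainly for speed and for matching models trained with it.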
GEGLU[[diffusers.models.activations.GEGLU]]
diffusers.models.activations.GEGLU[[diffusers.models.activations.GEGLU]]
A variant of the gated linear unit activation function.
Parameters:
dim_in (int): The number of channels in the input.
dim_out (int): The number of channels in the output.
bias (bool, defaults to True): Whether to use a bias in the linear layer.
ApproximateGELU[[diffusers.models.activations.ApproximateGELU]]
diffusers.models.activations.ApproximateGELU[[diffusers.models.activations.ApproximateGELU]]
The approximate form of the Gaussian Error Linear Unit (GELU). For more details, see Section 2 of the GELU paper (Hendrycks & Gimpel, 2016).
Parameters:
dim_in (int): The number of channels in the input.
dim_out (int): The number of channels in the output.
bias (bool, defaults to True): Whether to use a bias in the linear layer.
SwiGLU[[diffusers.models.activations.SwiGLU]]
diffusers.models.activations.SwiGLU[[diffusers.models.activations.SwiGLU]]
A variant of the gated linear unit activation function. It's similar to GEGLU but uses SiLU / Swish instead of GELU.
Parameters:
dim_in (int): The number of channels in the input.
dim_out (int): The number of channels in the output.
bias (bool, defaults to True): Whether to use a bias in the linear layer.
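Structurally this is the same split-and-gate pattern as GEGLU, with SiLU (x * sigmoid(x)) applied to the gate instead of GELU. A pure-Python sketch, again assuming the Linear(dim_in, 2 * dim_out) projection has already been applied:

```python
import math

def silu(x: float) -> float:
    # SiLU / Swish: x * sigmoid(x).
    return x / (1.0 + math.exp(-x))

def swiglu(projected: list[float]) -> list[float]:
    # First half of `projected` = hidden states, second half = gate.
    half = len(projected) // 2
    hidden, gate = projected[:half], projected[half:]
    return [h * silu(g) for h, g in zip(hidden, gate)]
```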
FP32SiLU[[diffusers.models.activations.FP32SiLU]]
diffusers.models.activations.FP32SiLU[[diffusers.models.activations.FP32SiLU]]
SiLU activation function with the input upcast to torch.float32.
LinearActivation[[diffusers.models.activations.LinearActivation]]
diffusers.models.activations.LinearActivation[[diffusers.models.activations.LinearActivation]]