Activation functions
Customized activation functions for supporting various models in 🤗 Diffusers.
GELU[[diffusers.models.activations.GELU]]
class diffusers.models.activations.GELU

- dim_in (int) -- The number of channels in the input.
- dim_out (int) -- The number of channels in the output.
- approximate (str, optional, defaults to "none") -- If "tanh", use tanh approximation.
- bias (bool, defaults to True) -- Whether to use a bias in the linear layer.

GELU activation function, with optional tanh approximation when approximate="tanh".
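The tanh approximation replaces the exact Gaussian CDF with a cheaper tanh expression. A minimal pure-Python sketch of that formula (the function name is illustrative, not part of the Diffusers API):

```python
import math

def gelu_tanh(x: float) -> float:
    # Tanh approximation of GELU, the form selected by approximate="tanh":
    # 0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x^3)))
    return 0.5 * x * (1.0 + math.tanh(math.sqrt(2.0 / math.pi) * (x + 0.044715 * x ** 3)))
```

In the module itself, this element-wise nonlinearity is applied after a linear projection from `dim_in` to `dim_out` channels.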
GEGLU[[diffusers.models.activations.GEGLU]]
class diffusers.models.activations.GEGLU

- dim_in (int) -- The number of channels in the input.
- dim_out (int) -- The number of channels in the output.
- bias (bool, defaults to True) -- Whether to use a bias in the linear layer.
A variant of the gated linear unit activation function.
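In a gated linear unit, the projection output is split into two halves, and one half (passed through GELU) gates the other. A pure-Python sketch of the gating step, assuming the linear projection to `2 * dim_out` channels has already been applied (names are hypothetical, not the Diffusers API):

```python
import math

def gelu(x: float) -> float:
    # Exact GELU via the Gaussian CDF.
    return 0.5 * x * (1.0 + math.erf(x / math.sqrt(2.0)))

def geglu(projected: list[float]) -> list[float]:
    # Split the projection output in half: one half is the hidden states,
    # the other half (after GELU) acts as a multiplicative gate.
    half = len(projected) // 2
    hidden, gate = projected[:half], projected[half:]
    return [h * gelu(g) for h, g in zip(hidden, gate)]
```

The gating means the effective output width is half the projection width, which is why the module's linear layer maps `dim_in` to `2 * dim_out`.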
ApproximateGELU[[diffusers.models.activations.ApproximateGELU]]
class diffusers.models.activations.ApproximateGELU

- dim_in (int) -- The number of channels in the input.
- dim_out (int) -- The number of channels in the output.
- bias (bool, defaults to True) -- Whether to use a bias in the linear layer.
The approximate form of the Gaussian Error Linear Unit (GELU). For more details, see section 2 of this paper.
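The sigmoid-based approximation from that paper computes `x * sigmoid(1.702 * x)`. A minimal sketch (function name is illustrative only):

```python
import math

def approximate_gelu(x: float) -> float:
    # Sigmoid approximation of GELU: x * sigmoid(1.702 * x)
    return x / (1.0 + math.exp(-1.702 * x))
```

This is cheaper than the exact erf-based GELU while staying close to it over typical activation ranges.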
SwiGLU[[diffusers.models.activations.SwiGLU]]
class diffusers.models.activations.SwiGLU

- dim_in (int) -- The number of channels in the input.
- dim_out (int) -- The number of channels in the output.
- bias (bool, defaults to True) -- Whether to use a bias in the linear layer.
A variant of the gated linear unit activation function. It's similar to
GEGLU but uses SiLU / Swish instead of GeLU.
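The gating structure is the same as GEGLU; only the gate nonlinearity changes from GELU to SiLU. A pure-Python sketch of the gating step, again assuming the linear projection has already been applied (names are hypothetical, not the Diffusers API):

```python
import math

def silu(x: float) -> float:
    # SiLU / Swish: x * sigmoid(x)
    return x / (1.0 + math.exp(-x))

def swiglu(projected: list[float]) -> list[float]:
    # One half of the projection output gates the other, with SiLU on the gate.
    half = len(projected) // 2
    hidden, gate = projected[:half], projected[half:]
    return [h * silu(g) for h, g in zip(hidden, gate)]
```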
FP32SiLU[[diffusers.models.activations.FP32SiLU]]
class diffusers.models.activations.FP32SiLU
SiLU activation function with input upcasted to torch.float32.
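Upcasting before the nonlinearity avoids precision loss when the surrounding model runs in half precision. A numpy sketch of the idea (numpy stands in for torch here; the function name is illustrative, not the Diffusers API):

```python
import numpy as np

def fp32_silu(x: np.ndarray) -> np.ndarray:
    # Upcast to float32, apply SiLU (x * sigmoid(x)), then cast back
    # to the input's original dtype.
    x32 = x.astype(np.float32)
    out = x32 / (1.0 + np.exp(-x32))
    return out.astype(x.dtype)

x = np.array([1.0, -2.0], dtype=np.float16)  # e.g. half-precision activations
y = fp32_silu(x)
```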
LinearActivation[[diffusers.models.activations.LinearActivation]]
class diffusers.models.activations.LinearActivation