Activation functions

Customized activation functions for supporting various models in 🤗 Diffusers.
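As a quick orientation, the sketch below shows how one of these modules might be instantiated and called. The channel sizes are made-up values for illustration, not taken from any particular model.

```python
import torch
from diffusers.models.activations import GELU

# Hypothetical channel sizes; the real values depend on the model that uses the layer.
act = GELU(dim_in=320, dim_out=1280, approximate="tanh")

hidden_states = torch.randn(2, 16, 320)  # (batch, sequence, channels)
out = act(hidden_states)                 # projected and activated: (2, 16, 1280)
```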

GELU[[diffusers.models.activations.GELU]]

class diffusers.models.activations.GELU


GELU activation function, with optional tanh approximation enabled by passing approximate="tanh".

Parameters:

dim_in (int) : The number of channels in the input.

dim_out (int) : The number of channels in the output.

approximate (str, optional, defaults to "none") : If "tanh", use tanh approximation.

bias (bool, defaults to True) : Whether to use a bias in the linear layer.
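In rough terms, the module amounts to a linear projection followed by GELU. A minimal sketch of that reading (not the library's exact implementation; the class name is illustrative):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GELUSketch(nn.Module):
    # Rough equivalent of the documented module: Linear(dim_in, dim_out) followed by GELU.
    def __init__(self, dim_in: int, dim_out: int, approximate: str = "none", bias: bool = True):
        super().__init__()
        self.proj = nn.Linear(dim_in, dim_out, bias=bias)
        self.approximate = approximate

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return F.gelu(self.proj(x), approximate=self.approximate)
```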

GEGLU[[diffusers.models.activations.GEGLU]]

class diffusers.models.activations.GEGLU


A variant of the gated linear unit activation function.

Parameters:

dim_in (int) : The number of channels in the input.

dim_out (int) : The number of channels in the output.

bias (bool, defaults to True) : Whether to use a bias in the linear layer.
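Conceptually, GEGLU projects the input to twice the output width and uses one half, passed through GELU, to gate the other half. A minimal sketch under that reading (not the library's exact code; the class name is illustrative):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GEGLUSketch(nn.Module):
    # GEGLU gating: one projected half multiplied by GELU of the other half,
    # implemented with a single Linear of width 2 * dim_out.
    def __init__(self, dim_in: int, dim_out: int, bias: bool = True):
        super().__init__()
        self.proj = nn.Linear(dim_in, dim_out * 2, bias=bias)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        hidden, gate = self.proj(x).chunk(2, dim=-1)
        return hidden * F.gelu(gate)
```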

ApproximateGELU[[diffusers.models.activations.ApproximateGELU]]

class diffusers.models.activations.ApproximateGELU


The approximate form of the Gaussian Error Linear Unit (GELU). For more details, see section 2 of the GELU paper (https://arxiv.org/abs/1606.08415).

Parameters:

dim_in (int) : The number of channels in the input.

dim_out (int) : The number of channels in the output.

bias (bool, defaults to True) : Whether to use a bias in the linear layer.
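The approximation in question is the sigmoid form GELU(x) ≈ x * sigmoid(1.702 * x). A minimal sketch applying it after the linear projection (an assumption about how the module is structured, not its exact code):

```python
import torch
import torch.nn as nn

class ApproximateGELUSketch(nn.Module):
    # Linear projection followed by the sigmoid approximation x * sigmoid(1.702 * x).
    def __init__(self, dim_in: int, dim_out: int, bias: bool = True):
        super().__init__()
        self.proj = nn.Linear(dim_in, dim_out, bias=bias)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.proj(x)
        return x * torch.sigmoid(1.702 * x)
```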

SwiGLU[[diffusers.models.activations.SwiGLU]]

class diffusers.models.activations.SwiGLU


A variant of the gated linear unit activation function. It is similar to GEGLU but uses SiLU / Swish instead of GELU.

Parameters:

dim_in (int) : The number of channels in the input.

dim_out (int) : The number of channels in the output.

bias (bool, defaults to True) : Whether to use a bias in the linear layer.
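Following the same gating pattern as GEGLU, but with SiLU (Swish) on the gate, a minimal sketch might look as follows (illustrative class name, not the library's exact code):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SwiGLUSketch(nn.Module):
    # Same gating pattern as GEGLU, but the gate goes through SiLU (Swish) instead of GELU.
    def __init__(self, dim_in: int, dim_out: int, bias: bool = True):
        super().__init__()
        self.proj = nn.Linear(dim_in, dim_out * 2, bias=bias)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        hidden, gate = self.proj(x).chunk(2, dim=-1)
        return hidden * F.silu(gate)
```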

FP32SiLU[[diffusers.models.activations.FP32SiLU]]

class diffusers.models.activations.FP32SiLU


SiLU activation function with input upcasted to torch.float32.
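A minimal sketch of that behavior: upcast to float32, apply SiLU, then (an assumption here) cast back to the input's original dtype.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class FP32SiLUSketch(nn.Module):
    # Run SiLU in float32 for numerical stability, then cast back to the input dtype.
    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return F.silu(x.float()).to(x.dtype)
```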

LinearActivation[[diffusers.models.activations.LinearActivation]]

class diffusers.models.activations.LinearActivation

