---
tags:
- kernels
---
> [!WARNING]
> This repository will soon be deleted as it is now deprecated. Please use [kernels-community/layer-norm](https://huggingface.co/kernels-community/layer-norm) instead.
| This CUDA extension implements fused dropout + residual + LayerNorm from the [flash-attention](https://github.com/Dao-AILab/flash-attention/tree/main/csrc/layer_norm) repo. |
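For reference, the semantics of the fused operation can be sketched in plain NumPy (this is an illustrative, unfused equivalent of what the CUDA kernel computes, not the kernel itself; the function name and signature here are hypothetical):

```python
import numpy as np

def dropout_add_layer_norm(x, residual, gamma, beta, p=0.1, eps=1e-5,
                           training=True, rng=None):
    """Unfused reference: out = LayerNorm(dropout(x) + residual)."""
    rng = rng or np.random.default_rng(0)
    if training and p > 0:
        # Inverted dropout: zero elements with probability p,
        # rescale the survivors by 1 / (1 - p).
        mask = (rng.random(x.shape) >= p).astype(x.dtype)
        x = x * mask / (1.0 - p)
    h = x + residual  # residual add
    # LayerNorm over the last dimension with learnable scale/shift.
    mu = h.mean(axis=-1, keepdims=True)
    var = h.var(axis=-1, keepdims=True)
    return (h - mu) / np.sqrt(var + eps) * gamma + beta

# Tiny usage example (dropout disabled for determinism):
x = np.random.default_rng(1).standard_normal((2, 8)).astype(np.float32)
residual = np.zeros_like(x)
gamma = np.ones(8, dtype=np.float32)
beta = np.zeros(8, dtype=np.float32)
out = dropout_add_layer_norm(x, residual, gamma, beta, training=False)
```

The fused CUDA version computes the same result in a single kernel launch, avoiding the extra memory round-trips of the three separate operations.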