FlashAttention-2 in Triton for sliding window attention (fwd + bwd pass)
Python · Updated Mar 6, 2025
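Sliding-window attention restricts each query to a fixed band of nearby keys, which is what lets the kernel skip whole tiles outside the window. Below is a minimal PyTorch reference for that masking semantics, useful for checking a Triton kernel against; the function name and parameters (`window_size`, `causal`) are illustrative assumptions, not the repository's actual API.

```python
import math
import torch

def sliding_window_attention(q, k, v, window_size: int, causal: bool = True):
    """Naive O(n^2) reference: each query attends only to keys within
    window_size positions of itself (behind it, if causal)."""
    # q, k, v: (batch, heads, seq_len, head_dim)
    scale = 1.0 / math.sqrt(q.size(-1))
    scores = torch.einsum("bhqd,bhkd->bhqk", q, k) * scale
    n = q.size(-2)
    pos_q = torch.arange(n, device=q.device)[:, None]  # query positions
    pos_k = torch.arange(n, device=q.device)[None, :]  # key positions
    # Keep keys in [i - window_size + 1, i] for query i (causal case).
    keep = (pos_q - pos_k) < window_size
    if causal:
        keep &= pos_k <= pos_q
    else:
        keep &= (pos_k - pos_q) < window_size  # symmetric window
    scores = scores.masked_fill(~keep, float("-inf"))
    attn = torch.softmax(scores, dim=-1)
    return torch.einsum("bhqk,bhkv->bhqv", attn, v)

# Example: 1 batch, 2 heads, 16 tokens, 8-dim heads, window of 4.
q, k, v = (torch.randn(1, 2, 16, 8) for _ in range(3))
out = sliding_window_attention(q, k, v, window_size=4)
print(out.shape)  # torch.Size([1, 2, 16, 8])
```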
Flash Attention (forward pass only) in roughly 200 lines of CUDA.
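The forward pass hinges on the online-softmax trick: scores are processed one key/value tile at a time, and a running row maximum plus a running normalizer let the output be accumulated without ever materializing the full attention matrix. A NumPy sketch of that loop, assuming a single head and no masking (the tile size and function name are illustrative):

```python
import numpy as np

def flash_attention_forward(q, k, v, tile: int = 32):
    """Single-head forward pass with online softmax over K/V tiles.
    Numerically matches softmax(q @ k.T / sqrt(d)) @ v."""
    n, d = q.shape
    scale = 1.0 / np.sqrt(d)
    out = np.zeros((n, d))
    m = np.full(n, -np.inf)   # running row max of the scores
    l = np.zeros(n)           # running softmax normalizer
    for start in range(0, n, tile):
        kj = k[start:start + tile]
        vj = v[start:start + tile]
        s = (q @ kj.T) * scale                 # (n, tile) score tile
        m_new = np.maximum(m, s.max(axis=1))
        p = np.exp(s - m_new[:, None])         # unnormalized tile probs
        alpha = np.exp(m - m_new)              # rescales old accumulators
        l = l * alpha + p.sum(axis=1)
        out = out * alpha[:, None] + p @ vj
        m = m_new
    return out / l[:, None]

# Sanity check against the naive implementation.
rng = np.random.default_rng(0)
q, k, v = (rng.standard_normal((64, 16)) for _ in range(3))
s = (q @ k.T) / np.sqrt(16)
e = np.exp(s - s.max(axis=1, keepdims=True))
ref = (e / e.sum(axis=1, keepdims=True)) @ v
assert np.allclose(flash_attention_forward(q, k, v), ref)
```

In the CUDA version this same loop runs per thread block over tiles held in shared memory; the rescaling by `alpha` is what keeps the accumulation exact despite never seeing all scores at once.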