commit 8170916 (1 parent: cb11b4d)
README.md
@@ -9,6 +9,12 @@ Original Triton code: [https://triton-lang.org/main/getting-started/tutorials/06
 
 See the original thread: [https://github.com/Dao-AILab/flash-attention/issues/352](https://github.com/Dao-AILab/flash-attention/issues/352)
 
+## Quick Install
+Create a Python environment (>=3.8) and install through pip:
+```
+pip install flashattention2-custom-mask
+```
+
 ## Example Setup
 The relevant libraries needed to use the custom-mask FlashAttention2 kernel are below:
 ```
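
The hunk ends just as the Example Setup code block opens, so the imports it lists are not visible in this diff. As a rough sketch only, a post-install smoke test might look like the following; the module path `fa2_custom_mask`, the entry point `flash_attn_func`, and the mask shape/dtype are all assumptions for illustration, not details confirmed by this commit:

```python
# Hypothetical smoke test after `pip install flashattention2-custom-mask`.
# NOTE: `fa2_custom_mask` and `flash_attn_func` are assumed names; check
# the repository README for the package's actual API.
import torch
from fa2_custom_mask import flash_attn_func  # assumed entry point

# FlashAttention2 kernels target CUDA GPUs in half precision.
q = torch.randn(2, 8, 1024, 64, device="cuda", dtype=torch.float16)
k = torch.randn(2, 8, 1024, 64, device="cuda", dtype=torch.float16)
v = torch.randn(2, 8, 1024, 64, device="cuda", dtype=torch.float16)

# A boolean custom mask (True = attend); the exact shape and dtype the
# kernel expects are also assumptions.
mask = torch.ones(2, 8, 1024, 1024, device="cuda", dtype=torch.bool)

out = flash_attn_func(q, k, v, mask)
print(out.shape)  # expected: torch.Size([2, 8, 1024, 64])
```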