
Commit 8170916

Add pip install
1 parent cb11b4d commit 8170916

File tree: 1 file changed, +6 -0 lines changed

README.md (+6)

````diff
@@ -9,6 +9,12 @@ Original Triton code: [https://triton-lang.org/main/getting-started/tutorials/06
 
 See the original thread: [https://github.com/Dao-AILab/flash-attention/issues/352](https://github.com/Dao-AILab/flash-attention/issues/352)
 
+## Quick Install
+Create a Python environment (>=3.8) and install through pip:
+```
+pip install flashattention2-custom-mask
+```
+
 ## Example Setup
 The relevant libraries needed to use the custom-mask FlashAttention2 kernel are below:
 ```
````
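The Quick Install section added here assumes an existing Python (>=3.8) environment. For completeness, a minimal setup sketch using Python's standard venv module; the environment name `.venv` is an arbitrary choice, and any other environment manager would work equally well:

```
# Create and activate a fresh virtual environment
# (any Python >= 3.8; ".venv" is just a conventional name).
python3 -m venv .venv
source .venv/bin/activate

# Install the package from PyPI, as documented in this commit.
pip install flashattention2-custom-mask
```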
