CONTRIBUTING.md (6 additions, 6 deletions)
@@ -4,7 +4,7 @@ Everyone is welcome to contribute, and we value everybody's contribution. Code c
It also helps us if you spread the word! Reference the library in blog posts about the awesome projects it made possible, shout out on Twitter every time it has helped you, or simply ⭐️ the repository to say thank you.
- However you choose to contribute, please be mindful and respect our [code of conduct](https://github.com/SmallDoges/flash-sparse-attention/blob/main/CODE_OF_CONDUCT.md).
+ However you choose to contribute, please be mindful and respect our [code of conduct](https://github.com/flash-algo/flash-sparse-attention/blob/main/CODE_OF_CONDUCT.md).
## Ways to contribute
@@ -16,7 +16,7 @@ There are several ways you can contribute to Flash-DMA:
* Contribute to the examples, benchmarks, or documentation.
* Improve CUDA kernel performance.
- If you don't know where to start, there is a special [Good First Issue](https://github.com/SmallDoges/flash-sparse-attention/contribute) listing. It will give you a list of open issues that are beginner-friendly and help you start contributing to open-source.
+ If you don't know where to start, there is a special [Good First Issue](https://github.com/flash-algo/flash-sparse-attention/contribute) listing. It will give you a list of open issues that are beginner-friendly and help you start contributing to open-source.
> All contributions are equally valuable to the community. 🥰
@@ -81,14 +81,14 @@ You will need basic `git` proficiency to contribute to Flash-DMA. You'll need **
### Development Setup
- 1. Fork the [repository](https://github.com/SmallDoges/flash-sparse-attention) by clicking on the **Fork** button.
+ 1. Fork the [repository](https://github.com/flash-algo/flash-sparse-attention) by clicking on the **Fork** button.
2. Clone your fork to your local disk, and add the base repository as a remote:
3. Create a new branch to hold your development changes:
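The actual commands for steps 2 and 3 sit outside this diff hunk and are unchanged by it. A minimal sketch of what they typically look like, assuming the new flash-algo organization URL, a placeholder `<your-username>` fork, and a hypothetical branch name:

```bash
# Clone your fork and add the base repository as an upstream remote
git clone https://github.com/<your-username>/flash-sparse-attention.git
cd flash-sparse-attention
git remote add upstream https://github.com/flash-algo/flash-sparse-attention.git

# Create a new branch to hold your development changes
git checkout -b my-feature-branch
```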
@@ -157,7 +157,7 @@ You will need basic `git` proficiency to contribute to Flash-DMA. You'll need **
### Tests
- An extensive test suite is included to test the library behavior and performance. Tests can be found in the [tests](https://github.com/SmallDoges/flash-sparse-attention/tree/main/tests) folder and benchmarks in the [benchmarks](https://github.com/SmallDoges/flash-sparse-attention/tree/main/benchmarks) folder.
+ An extensive test suite is included to test the library behavior and performance. Tests can be found in the [tests](https://github.com/flash-algo/flash-sparse-attention/tree/main/tests) folder and benchmarks in the [benchmarks](https://github.com/flash-algo/flash-sparse-attention/tree/main/benchmarks) folder.
We use `pytest` for testing. From the root of the repository, run:
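The command itself is outside this hunk; a typical invocation, assuming the `tests/` folder referenced above, would be:

```bash
# Run the full test suite from the repository root
pytest tests/
```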
@@ -200,6 +200,6 @@ If you discover a security vulnerability, please send an e-mail to the maintaine
## Questions?
- If you have questions about contributing, feel free to ask in the [GitHub Discussions](https://github.com/SmallDoges/flash-sparse-attention/discussions) or open an issue.
+ If you have questions about contributing, feel free to ask in the [GitHub Discussions](https://github.com/flash-algo/flash-sparse-attention/discussions) or open an issue.
Thank you for contributing to Flash Sparse Attention! 🚀