1 parent 2f4216b commit 7afcbc8
README.md
@@ -16,6 +16,8 @@ High-performance Diffusion Transformer (DiT) implementation from scratch using C
 - Block-level parallelism for multi-head attention
 - Memory coalescing for Q, K, V matrix operations
 
+and more!
+
 ## Usage
 
 ```python