Pull requests: flashinfer-ai/flashinfer
feat(aot): add nvshmem module for aot compilation
#1261 · opened Jul 15, 2025 by EmilienM · 3 of 5 tasks
refactor: separate SM100 and legacy TRT-LLM comm modules
#1259 · opened Jul 15, 2025 by EmilienM · 3 of 5 tasks
feat: update scale factor interface for trtllm-gen mla kernels
#1248 · opened Jul 14, 2025 by yyihuang · 5 tasks done
bugfix: fix fp32 acc threshold for qk using math::inf according to dtype (by AIDC-AI)
#1247 · opened Jul 14, 2025 by yongchaoding · 5 tasks done
Add the keyword "template" where a member template specialization appears after . or -> in a postfix expression, as required by the C++ standard
#1246 · opened Jul 14, 2025 by tomflinda
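For context on the rule this PR title refers to, below is a minimal standalone sketch (not code from the PR itself): when a member template of a dependent type is named after `.` or `->`, the `template` keyword is required so the compiler parses `<` as the start of a template argument list rather than a less-than operator.

```cpp
#include <cstdio>

struct Packer {
  // A member template; the name only becomes "dependent" when accessed
  // through a template parameter, as in use() below.
  template <int N>
  int pack() const { return N; }
};

template <typename T>
int use(const T& t) {
  // Without the `template` disambiguator, `t.pack<4>()` would be parsed as
  // `(t.pack < 4) > ()`, which conforming compilers reject.
  return t.template pack<4>();
}

int main() {
  Packer p;
  std::printf("%d\n", use(p));  // prints 4
  return 0;
}
```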
Add trtllm-gen attention mha kernel with FP8 Q/K/V and FP8 output
#1242 · opened Jul 11, 2025 by weireweire
feat: Restore convenience FLASHINFER_ENABLE_AOT option
#1235 · opened Jul 8, 2025 by mgorny · 3 of 5 tasks
[Feature] Support batch prefill for POD Attention
#1231 · opened Jul 8, 2025 by Edenzzzz · 6 tasks
Feature/sm100 low latency nvfp4 kernels (label: priority: high)
#1214 · opened Jul 4, 2025 by azhurkevich · 1 of 5 tasks
Use flashinfer softmax in top_k_top_p_sampling_from_logits
#1171 · opened Jun 24, 2025 by lgeiger · 5 tasks done
Port AllGather/ReduceScatter from TensorRT-LLM
#1145 · opened Jun 15, 2025 by wenscarl · 5 tasks
[WIP] refactor: unifying return status of different backend implementations
#1141 · opened Jun 12, 2025 by yzh119 · 5 tasks