ERROR about flash_attn: can you provide a version that works on these older NVIDIA cards?
```
out, q, k, v, out_padded, softmax_lse, S_dmask, rng_state = flash_attn_cuda.fwd(
                                                            ^^^^^^^^^^^^^^^^^^^^
RuntimeError: FlashAttention only supports Ampere GPUs or newer.
```
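The error is raised because FlashAttention's CUDA kernels require compute capability >= 8.0 (Ampere or newer), so they will not run on pre-Ampere cards such as Volta (sm_70) or Turing (sm_75). A common workaround is to check the device's compute capability at runtime and fall back to PyTorch's built-in `torch.nn.functional.scaled_dot_product_attention`, which can dispatch to kernels that run on older GPUs. Below is a minimal sketch of that pattern, assuming PyTorch >= 2.0 and the usual `flash_attn` tensor layout; the `attention` wrapper name is hypothetical:

```python
import torch
import torch.nn.functional as F


def attention(q, k, v, causal=False):
    # q, k, v: (batch, seqlen, nheads, headdim), the layout flash_attn expects.
    major, _ = torch.cuda.get_device_capability(q.device)
    if major >= 8:
        # FlashAttention kernels are only built for Ampere (sm_80) and newer,
        # so only import and use them where they are supported.
        from flash_attn import flash_attn_func
        return flash_attn_func(q, k, v, causal=causal)
    # Fallback for older GPUs: PyTorch's scaled_dot_product_attention picks a
    # memory-efficient or plain math kernel that runs on pre-Ampere hardware.
    # SDPA expects (batch, nheads, seqlen, headdim), so transpose around the call.
    out = F.scaled_dot_product_attention(
        q.transpose(1, 2),
        k.transpose(1, 2),
        v.transpose(1, 2),
        is_causal=causal,
    )
    return out.transpose(1, 2)
```

The fallback is slower and uses more memory than the fused FlashAttention kernel, but it produces the same attention output, so models that guard the call this way can still run on older cards.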