Torch was not compiled with flash attention warning #1375

Open
@lostmsu

Description

This warning is printed when I call functional.scaled_dot_product_attention:

[W914 13:25:36.000000000 sdp_utils.cpp:555] Warning: 1Torch was not compiled with flash attention. (function operator ())

I'm on Windows with TorchSharp-cuda-windows=0.103.0
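
For reference, here is a minimal sketch of the kind of call that triggers the warning. The tensor shapes, device handling, and program scaffolding are illustrative assumptions on my part, not code from the report; only the scaled_dot_product_attention call itself comes from the description above.

```csharp
// Minimal repro sketch (assumed shapes and device setup, not the reporter's actual code).
using System;
using static TorchSharp.torch;

// The flash-attention path only applies on the GPU, so run on CUDA if it is available.
var device = cuda.is_available() ? CUDA : CPU;

// Hypothetical shapes: (batch, heads, sequence length, head dimension).
var q = randn(new long[] { 1, 8, 128, 64 }, device: device);
var k = randn(new long[] { 1, 8, 128, 64 }, device: device);
var v = randn(new long[] { 1, 8, 128, 64 }, device: device);

// With the TorchSharp-cuda-windows 0.103.0 package, this call emits the
// "Torch was not compiled with flash attention" warning quoted above.
var output = nn.functional.scaled_dot_product_attention(q, k, v);
Console.WriteLine(string.Join(", ", output.shape));
```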
