From 4bc0491752b6d891c5bb33a3d18c5da7334fc6df Mon Sep 17 00:00:00 2001
From: Driss Guessous
Date: Tue, 24 Jan 2023 22:59:47 +0000
Subject: [PATCH] Add USE_FLASH_ATTENTION flag to setup.py (#92903)

# Summary
Adds documentation to setup.py explaining that USE_FLASH_ATTENTION=0 disables building flash attention, which decreases build times.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/92903
Approved by: https://github.com/cpuhrsch, https://github.com/bdhirsh
---
 setup.py | 3 +++
 1 file changed, 3 insertions(+)

diff --git a/setup.py b/setup.py
index e428dc874f0f7..4fafbf59261c4 100644
--- a/setup.py
+++ b/setup.py
@@ -95,6 +95,9 @@
 #   USE_FFMPEG
 #     enables use of ffmpeg for additional operators
 #
+#   USE_FLASH_ATTENTION=0
+#     disables building flash attention for scaled dot product attention
+#
 #   USE_LEVELDB
 #     enables use of LevelDB for storage
 #
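
For reference, a minimal sketch (not part of the patch) of driving a source build with the flag set. It assumes the current directory is a pytorch checkout with build dependencies installed and uses the usual "python setup.py develop" entry point; the equivalent one-liner from a shell is USE_FLASH_ATTENTION=0 python setup.py develop.

    # Sketch: build PyTorch from source with flash attention disabled
    # to shorten compile times. Assumes a pytorch checkout as the cwd.
    import os
    import subprocess

    env = dict(os.environ)
    env["USE_FLASH_ATTENTION"] = "0"  # skip compiling the flash attention kernels

    # Editable/source build, as described in the PyTorch build docs.
    subprocess.run(["python", "setup.py", "develop"], env=env, check=True)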