Improve performance with large number of queued prompts #8176
comfyanonymous merged 3 commits into Comfy-Org:master
Conversation
Nice work looking into this, with benchmarks to show it. I had something similar opened months ago with zero activity, but I'll go ahead and close that since I think your implementation makes more sense.
This is great.
Testing Network & UI Performance
With this PR:
Without this PR:
I can do some other tests later with more features that interact with the prompt queue.
Testing Queue Operations/Features from UI
test-queue-pr-queue-button-features.mp4
Everything seems to work the same as before, with better performance in the case where we queue a large batch at once.
PR Comfy-Org/ComfyUI#8176 in the ComfyUI repository broke Visionatrix, specifically the dev version. These small changes are similar to parts of that PR and fix the breakage. Signed-off-by: bigcat88 <bigcat88@icloud.com>
The same issue still exists when using the ComfyUI desktop. Performance goes down rapidly when more than 100 prompts are queued.
Which desktop version? The performance still scales with queue size, just less steeply than before. There may also be other bottlenecks besides copying queue items.
I am using the nightly version of ComfyUI desktop, updated on 4th August, with ComfyUI v0.3.48. When I queue more than 100 tasks it starts lagging, and it almost freezes when the queue reaches 200-300.
Just to confirm: this did not happen before, but it did start happening after you updated recently?
It actually started happening after an update in mid-May. I used to queue 200-300 tasks and leave them running overnight without issue, but after that update it started lagging and eventually froze after queuing more than 100 items.
I'm not aware of any changes from that period that could cause that. It is also influenced by the number of items in your input folder, the size of your workflows, the number of custom nodes you have, and in particular the number of custom nodes that have long file lists as inputs.
@mikaizhao-hue See if this extension works for you. It takes a different approach to internal queue storage and, from my tests so far, can handle a large number of queue items without hiccups, among other features.
Thank you, will give it a try
I just updated from ComfyUI 0.3.10-40 / Frontend 1.6.18 to ComfyUI 0.3.50 / Frontend 1.23.4, and now queuing is really slow when I try to queue thousands of prompts in advance. It starts to slow down once 300 items are in the queue. My old version would also slow down, but not right after opening the server and the browser, when it could queue thousands really fast. Now it is extremely slow even right after opening. It's a shame, because aside from this issue the new version is better in every way, but sadly it is very difficult to use when queuing up before I leave the PC.
Thanks a lot for the suggestion, but unfortunately this doesn't really help with the queuing speed; it is still just as slow. I already use some other custom nodes for queue saving and loading. It does have some cool features, but it doesn't solve the particular problem that the version I updated to introduced.
@kicapanmanis07-ux I wonder if something else is at play here, then, since I can easily queue thousands of jobs with no degradation in queuing performance, tested on the latest ComfyUI. I suggest turning off extensions and re-enabling them one by one to see which one might be causing performance issues. Extensions around queuing, especially ones that generate alterations during queuing, would be the most likely offenders.
queuing.mp4
Thanks! This is working wonders for me! |
Mitigates Comfy-Org/ComfyUI_frontend#2435
ComfyUI's web app exhibits significant lag when a large number of prompts are queued. I verified what was found in the linked thread, pinpointing the slowdown to the specific call `api.getQueue()` as the number of prompts grows. Every time a prompt is submitted, the front-end synchronizes the entire queue state by making this call. Profiling the back-end, I found most of the time is spent executing `copy.deepcopy` inside `PromptQueue` while a mutex is held.

This PR adds a new version of `PromptQueue`'s `get_current_queue` method that performs `copy.copy` instead of `copy.deepcopy`. Allowing read-only shared access is safe as long as `PromptQueue` treats its queue items as immutable, which it does by returning deep copies in its other methods. I named the new version `get_current_queue_volatile` to remind callers they must only read and release these values.

Using the updated method allows Comfy to respond much faster to requests to the `/queue` route. The web app remains usable on my machine with several hundred queue items. I enqueued 300 prompts for these benchmarks.

Notice how the whole `deepcopy` block is gone, and you can actually see Comfy working on other stuff:

Caching the JSON representations may be another 80/20 fix. I could look into that in the future. A more complex but optimized solution would be to rewrite state sync so the front end only requests a diff with updates to the queue instead of a full sync. Still, this quick fix helps a lot.
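The idea behind the fix can be sketched with a minimal mock. This is illustrative only, not ComfyUI's actual `PromptQueue`: the attribute names (`queue`, `currently_running`, `mutex`) follow the description in this PR, while the item format and everything else here are assumptions.

```python
import copy
import threading


class PromptQueue:
    """Toy stand-in for ComfyUI's PromptQueue (illustrative only)."""

    def __init__(self):
        self.mutex = threading.RLock()
        self.queue = []              # pending prompt items
        self.currently_running = {}  # running items, keyed by id

    def put(self, item):
        with self.mutex:
            self.queue.append(item)

    def get_current_queue(self):
        # Original behavior: deep-copy every item while the mutex is
        # held. Cost grows with queue length times item size, so the
        # /queue route slows down as prompts pile up.
        with self.mutex:
            running = [copy.deepcopy(x) for x in self.currently_running.values()]
            return copy.deepcopy(self.queue), running

    def get_current_queue_volatile(self):
        # New behavior: shallow copies only. Callers receive the same
        # item objects, which is safe as long as they treat them as
        # read-only and the queue never mutates items in place.
        with self.mutex:
            running = list(self.currently_running.values())
            return copy.copy(self.queue), running
```

The shallow version still copies the containers, so a caller iterating the snapshot is unaffected by items being enqueued or dequeued afterwards; only mutation of the items themselves would be visible, which the immutability convention rules out.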