Description
I would try to fix this myself and submit it as a PR, but I can't seem to find where the problem is.
I use a LiteLLM proxy to control my requests, which I have configured as an OpenAI-compatible LLM.
The add-in works perfectly for small numbers of cells.
But it looks like something on the Excel side is extremely slow.
If I add =prompt(A2, "Tell me the value given") to a single cell, it works perfectly.
But if I copy that formula over 25,000 cells, it just stays in the #N/B (#N/A) state, and my LiteLLM proxy does not receive any requests either, so it is not LLM slowness.
If I copy it to 100 cells, it stays on #N/B for about a minute and then starts filling in the cells.
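For what it's worth, this is roughly the sanity check I can run against the proxy's OpenAI-compatible endpoint to confirm it responds on its own (the URL, API key, and model name are placeholders for my setup), which is why I'm fairly sure the slowness is not on the LLM side:

```python
# Quick sanity check against the LiteLLM proxy's OpenAI-compatible endpoint.
# URL, API key, and model name below are placeholders for my own setup.
import time
import requests

start = time.time()
resp = requests.post(
    "http://localhost:4000/v1/chat/completions",
    headers={"Authorization": "Bearer sk-placeholder"},
    json={
        "model": "my-model",
        "messages": [{"role": "user", "content": "Tell me the value given: 42"}],
    },
    timeout=30,
)
# Print HTTP status, round-trip time, and the model's reply.
print(resp.status_code, round(time.time() - start, 2), "s")
print(resp.json()["choices"][0]["message"]["content"])
```

A direct call like this comes back promptly, while the formulas sit on #N/B without the proxy ever seeing a request.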
To me it looks like something on the Excel side, but I can't pin it down (I have no experience profiling or debugging Excel add-ins).
Is there any pointer you could give me on where to look?