When implementing a user-defined task, I find the response times to be extremely slow even when using bulk document submission in spacy-llm, i.e. "list(nlp.pipe(docs))".
With this technique, each document appears to be submitted one at a time in a queue-like structure, rather than being sent to Azure as a single bulk query with the entire result returned at once. Is this the expected behavior? I have looked into the Azure LLM APIs, and they do appear to have bulk submission capabilities built in. I would love to update my code to speed up processing time with this implementation.
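To illustrate the cost difference being described, here is a minimal sketch contrasting per-document submission with a single bulk request. `call_azure` is a hypothetical stand-in for the real Azure API call, not part of spacy-llm; it only counts HTTP round-trips instead of making them:

```python
# Hypothetical stand-in for the Azure LLM call: one invocation = one round-trip.
round_trips = 0

def call_azure(prompts):
    global round_trips
    round_trips += 1  # each invocation simulates one HTTP request
    return [f"response to {p}" for p in prompts]

docs = ["doc one", "doc two", "doc three"]

# One-at-a-time submission (the queue-like behavior observed): N round-trips.
round_trips = 0
sequential = [call_azure([d])[0] for d in docs]
sequential_trips = round_trips

# Single bulk query: all prompts in one request, one round-trip.
round_trips = 0
bulk = call_azure(docs)
bulk_trips = round_trips

print(sequential_trips, bulk_trips)  # 3 round-trips vs. 1
```

With real network latency, each round-trip adds on the order of the full request latency, which is why per-document submission scales so poorly compared to a true batch endpoint.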