Hey @imjwang! Great to see you diving into new challenges. Hope all's been well since our last coding adventure. 🚀
Based on the context provided, it seems like you're encountering an issue where the GoogleGenerativeAIEmbeddings class in the langchainjs framework is failing silently when trying to embed more than 100 documents at once.
The embedDocuments method in the GoogleGenerativeAIEmbeddings class is responsible for embedding documents. It takes an array of document texts as input and returns a promise that resolves to a 2D array with one embedding vector per document; under the hood it calls the _embedDocumentsContent method to produce those embeddings.
However, the exact limit on the number of documents that can be embedded in a single call is not documented in the langchainjs repository. The limit may be imposed by the underlying Google Generative AI API (its batchEmbedContents endpoint is documented to accept at most 100 requests per batch, which would match the symptom), or it may be a limitation of the current implementation of the embedDocuments method.
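In the meantime, one possible workaround is to chunk the inputs yourself so that no single call handles more than 100 documents. Here is a minimal sketch: embedInBatches is a hypothetical helper, not part of langchainjs, and the batch size of 100 is only an assumption based on the reported threshold.

```ts
import { GoogleGenerativeAIEmbeddings } from "@langchain/google-genai";

// Hypothetical helper (not part of langchainjs): embeds texts in chunks
// so that no single embedDocuments call exceeds `batchSize` items.
async function embedInBatches(
  embeddings: GoogleGenerativeAIEmbeddings,
  texts: string[],
  batchSize = 100 // assumption: the observed ~100-document threshold
): Promise<number[][]> {
  const vectors: number[][] = [];
  for (let i = 0; i < texts.length; i += batchSize) {
    const chunk = texts.slice(i, i + batchSize);
    vectors.push(...(await embeddings.embedDocuments(chunk)));
  }
  return vectors;
}
```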
To help us better understand the issue, could you provide more details about the error you're encountering? Specifically, it would be helpful to know:
- The exact error message you're seeing, if any.
- The number of documents you're trying to embed when the error occurs.
- Any relevant code snippets that could help us reproduce the issue.
This information will help us identify whether this is a bug in the langchainjs framework, a limitation of the Google Generative AI API, or something else entirely. Once we have a better understanding of the issue, we can suggest appropriate next steps.
For more information about the GoogleGenerativeAIEmbeddings class and its methods, you can refer to the langchainjs repository.
I was trying to use GoogleGenerativeAIEmbeddings with SupabaseVectorStore, and it seems to fail silently when trying to add more than 100 documents (simplified sketch below).
see #4279
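Roughly what I was doing, simplified — the Supabase table and query names here are placeholders for my actual setup:

```ts
import { createClient } from "@supabase/supabase-js";
import { SupabaseVectorStore } from "@langchain/community/vectorstores/supabase";
import { GoogleGenerativeAIEmbeddings } from "@langchain/google-genai";
import { Document } from "@langchain/core/documents";

const client = createClient(
  process.env.SUPABASE_URL!,
  process.env.SUPABASE_SERVICE_ROLE_KEY!
);

// 150 trivial documents, enough to cross the ~100-document threshold.
const docs = Array.from(
  { length: 150 },
  (_, i) => new Document({ pageContent: `test document ${i}` })
);

const store = new SupabaseVectorStore(new GoogleGenerativeAIEmbeddings(), {
  client,
  tableName: "documents",       // placeholder for my pgvector table
  queryName: "match_documents", // placeholder for my match function
});

// Silently fails when docs.length > 100: no error is thrown,
// but the documents never show up in the table.
await store.addDocuments(docs);
```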