feat: support switching LLM providers and embedding providers through configuration file #181

Conversation

fix typecheck
All contributors have signed the CLA ✍️ ✅
👍 Looks good to me! Reviewed everything up to 0f70eed in 30 seconds
More details
- Looked at `133` lines of code in `3` files
- Skipped `0` files when reviewing.
- Skipped posting `4` drafted comments based on config settings.
1. `server/graph_service/zep_graphiti.py:104`
- Draft comment: Ensure `embedder.config.embedding_model` is set regardless of `settings.embedding_model_name` to avoid unexpected behavior.
- Reason this comment was not posted: Confidence changes required: `80%`. The code in `zep_graphiti.py` has a potential issue with the `embedder` configuration. `embedder.config.embedding_model` is set only if `settings.embedding_model_name` is not None, but it should be set regardless so the embedder is properly configured. This could lead to unexpected behavior if the model name is not set.
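The fix the draft comment hints at can be sketched as an unconditional assignment with a fallback. This is a hypothetical illustration, not the PR's code: the helper name `configure_embedder` and the default model string are assumptions.

```python
# Assumed default; the actual fallback model in graphiti may differ.
DEFAULT_EMBEDDING_MODEL = "text-embedding-3-small"

def configure_embedder(embedder, settings):
    # Set the model unconditionally, so the embedder is never left
    # half-configured when settings.embedding_model_name is unset.
    embedder.config.embedding_model = (
        settings.embedding_model_name or DEFAULT_EMBEDDING_MODEL
    )
    return embedder
```

With this shape, a missing `EMBEDDING_MODEL_NAME` environment variable degrades to a known default instead of leaving the client in an undefined state.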
2. `server/graph_service/zep_graphiti.py:75`
- Draft comment: The logic for setting up the LLM client and embedder is repeated. Consider extracting this logic into separate functions to adhere to the DRY principle.
- Reason this comment was not posted: Confidence changes required: `80%`. The code in `zep_graphiti.py` is not following the DRY principle. The setup logic for the LLM client and embedder is repeated in the `get_graphiti` function and should be extracted into separate functions.
3. `server/graph_service/zep_graphiti.py:4`
- Draft comment: Ensure that no sensitive data is logged in the application. Review all logging statements for potential exposure of sensitive information.
- Reason this comment was not posted: Comment did not seem useful.
4. `server/graph_service/config.py:1`
- Draft comment: Add a copyright notice at the top of this file.
- Reason this comment was not posted: Comment was on unchanged code.
Workflow ID: wflow_hgydPG72fFhawC6D
You can customize Ellipsis with 👍 / 👎 feedback, review rules, user-specific overrides, quiet mode, and more.
I have read the CLA Document and I hereby sign the CLA
Important

Adds support for switching LLM and embedding providers, including Groq, Anthropic, and Voyage, with corresponding configuration and documentation updates.

- Supports LLM providers (`openai`, `groq`, `anthropic`) and embedding providers (`openai`, `voyage`) in `zep_graphiti.py`.
- Updates `get_graphiti()` to initialize `LLMClient` and `EmbedderClient` based on `Settings`.
- Adds `llm_provider` and `embedding_provider` fields to `Settings` in `config.py`.
- Adds settings for `groq`, `anthropic`, and `voyage` in `Settings`.
- Updates `README.md` to include information about new LLM and embedding providers.

This description was created by Ellipsis for 0f70eed. It will automatically update as commits are pushed.
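The new `Settings` fields described above can be sketched with stdlib-only code. This is a hypothetical mirror of the `config.py` change (the actual class is likely a pydantic `BaseSettings`); field names follow the summary, environment variable names and defaults are assumptions.

```python
import os
from dataclasses import dataclass, field

@dataclass
class ProviderSettings:
    """Sketch of the provider-switch fields added to Settings."""
    # "openai" | "groq" | "anthropic"
    llm_provider: str = field(
        default_factory=lambda: os.getenv("LLM_PROVIDER", "openai"))
    # "openai" | "voyage"
    embedding_provider: str = field(
        default_factory=lambda: os.getenv("EMBEDDING_PROVIDER", "openai"))
```

With defaults on both fields, existing deployments that set only OpenAI credentials keep working unchanged, while a single environment variable switches providers.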