
feat: support switching LLM providers and embedding providers through… #181

Closed
wants to merge 2 commits

Conversation

ArnoChenFx
Contributor

@ArnoChenFx ArnoChenFx commented Oct 8, 2024

Important

Adds support for switching LLM and embedding providers, including Groq, Anthropic, and Voyage, with corresponding configuration and documentation updates.

  • Behavior:
    • Adds support for switching LLM providers (openai, groq, anthropic) and embedding providers (openai, voyage) in zep_graphiti.py.
    • Updates get_graphiti() to initialize LLMClient and EmbedderClient based on Settings.
  • Configuration:
    • Adds llm_provider and embedding_provider fields to Settings in config.py.
    • Adds API key fields for groq, anthropic, and voyage in Settings.
  • Documentation:
    • Updates README.md to include information about new LLM and embedding providers.
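A minimal sketch of what this provider switching could look like. The `Settings` field names match the ones described above, but the client classes and factory functions here are illustrative stand-ins, not the PR's actual code:

```python
from dataclasses import dataclass

# Hypothetical stand-ins for the real LLM and embedder clients;
# the actual classes in graphiti differ.
@dataclass
class LLMClient:
    provider: str
    api_key: str

@dataclass
class EmbedderClient:
    provider: str
    api_key: str

@dataclass
class Settings:
    llm_provider: str = "openai"        # one of: openai, groq, anthropic
    embedding_provider: str = "openai"  # one of: openai, voyage
    openai_api_key: str = ""
    groq_api_key: str = ""
    anthropic_api_key: str = ""
    voyage_api_key: str = ""

def build_llm_client(settings: Settings) -> LLMClient:
    """Pick an LLM client based on the configured provider."""
    keys = {
        "openai": settings.openai_api_key,
        "groq": settings.groq_api_key,
        "anthropic": settings.anthropic_api_key,
    }
    if settings.llm_provider not in keys:
        raise ValueError(f"unknown LLM provider: {settings.llm_provider!r}")
    return LLMClient(settings.llm_provider, keys[settings.llm_provider])

def build_embedder_client(settings: Settings) -> EmbedderClient:
    """Pick an embedder client based on the configured provider."""
    keys = {
        "openai": settings.openai_api_key,
        "voyage": settings.voyage_api_key,
    }
    if settings.embedding_provider not in keys:
        raise ValueError(
            f"unknown embedding provider: {settings.embedding_provider!r}"
        )
    return EmbedderClient(settings.embedding_provider, keys[settings.embedding_provider])
```

With this shape, `get_graphiti()` would call both factories once and pass the resulting clients into the Graphiti constructor, so adding a new provider only touches the relevant lookup table.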

This description was created by Ellipsis for 0f70eed. It will automatically update as commits are pushed.


github-actions bot commented Oct 8, 2024

All contributors have signed the CLA ✍️ ✅
Posted by the CLA Assistant Lite bot.

Contributor

@ellipsis-dev ellipsis-dev bot left a comment


👍 Looks good to me! Reviewed everything up to 0f70eed in 30 seconds

More details
  • Looked at 133 lines of code in 3 files
  • Skipped 0 files when reviewing.
  • Skipped posting 4 drafted comments based on config settings.
1. server/graph_service/zep_graphiti.py:104
  • Draft comment:
    Ensure embedder.config.embedding_model is set regardless of settings.embedding_model_name to avoid unexpected behavior.
  • Reason this comment was not posted:
    Confidence changes required: 80%
The code in zep_graphiti.py has a potential issue with the embedder configuration: embedder.config.embedding_model is set only when settings.embedding_model_name is not None, but it should be set unconditionally (falling back to a default) so the embedder is always properly configured. Otherwise the embedder may behave unexpectedly when no model name is set.
2. server/graph_service/zep_graphiti.py:75
  • Draft comment:
    The logic for setting up the LLM client and embedder is repeated. Consider extracting this logic into separate functions to adhere to the DRY principle.
  • Reason this comment was not posted:
    Confidence changes required: 80%
The code in zep_graphiti.py does not follow the DRY principle: the logic for setting up the LLM client and the embedder is repeated inside the get_graphiti function. Extracting it into separate factory functions would avoid the repetition.
3. server/graph_service/zep_graphiti.py:4
  • Draft comment:
    Ensure that no sensitive data is logged in the application. Review all logging statements for potential exposure of sensitive information.
  • Reason this comment was not posted:
    Comment did not seem useful.
4. server/graph_service/config.py:1
  • Draft comment:
    Add a copyright notice at the top of this file.
  • Reason this comment was not posted:
    Comment was on unchanged code.
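Draft comment 1 above suggests always assigning the embedding model rather than only when a name is configured. A small sketch of that pattern, where `EmbedderConfig` and the default model name are assumptions for illustration, not graphiti's actual API:

```python
from dataclasses import dataclass
from typing import Optional

# Assumed default; the real project may use a different model name.
DEFAULT_EMBEDDING_MODEL = "text-embedding-3-small"

@dataclass
class EmbedderConfig:
    embedding_model: str = DEFAULT_EMBEDDING_MODEL

def configure_embedder(config: EmbedderConfig, model_name: Optional[str]) -> EmbedderConfig:
    # Set the model unconditionally: use the configured name when given,
    # otherwise fall back to the default instead of leaving it unset.
    config.embedding_model = model_name or DEFAULT_EMBEDDING_MODEL
    return config
```

This makes the fallback explicit in one place, so the embedder is never left with an unset model when settings.embedding_model_name is None.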

Workflow ID: wflow_hgydPG72fFhawC6D


You can customize Ellipsis with 👍 / 👎 feedback, review rules, user-specific overrides, quiet mode, and more.

@ArnoChenFx
Contributor Author

I have read the CLA Document and I hereby sign the CLA

@ArnoChenFx ArnoChenFx closed this Oct 8, 2024
@ArnoChenFx ArnoChenFx reopened this Oct 8, 2024
@github-actions github-actions bot locked and limited conversation to collaborators Oct 8, 2024