Add support for Llama2, Palm, Cohere, Anthropic, Replicate, Azure Models - using litellm (#13)

This PR adds support for 50+ models with a standard I/O interface using:
https://github.com/BerriAI/litellm/

`ChatLiteLLM()` is integrated into LangChain and lets you call all of these models through the same I/O interface as `ChatOpenAI`:
https://python.langchain.com/docs/integrations/chat/litellm

Here's an example of how to use `ChatLiteLLM()`:
```python
ChatLiteLLM(model="gpt-3.5-turbo")
ChatLiteLLM(model="claude-2", temperature=0.3)
ChatLiteLLM(model="command-nightly")
ChatLiteLLM(model="replicate/llama-2-70b-chat:2c1608e18606fad2812020dc541930f2d0495ce32eee50074220b87300bc16e1")
```
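The model strings above follow litellm's provider-prefix convention: strings like `replicate/<model>` are routed to the named provider, while bare names such as `gpt-3.5-turbo` fall through to a default. A minimal sketch of that idea (the helper name and default are illustrative assumptions, not litellm's actual routing code, which also handles alias tables and per-provider defaults):

```python
def split_model_string(model: str, default_provider: str = "openai") -> tuple[str, str]:
    """Split a litellm-style model string into (provider, model name).

    Hypothetical helper for illustration only; litellm's real
    dispatch logic is more involved.
    """
    if "/" in model:
        provider, name = model.split("/", 1)
        return provider, name
    # No prefix: assume the default provider handles this model name.
    return default_provider, model


print(split_model_string("gpt-3.5-turbo"))
print(split_model_string("replicate/llama-2-70b-chat:2c1608e1"))
```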

---------

Co-authored-by: fynnfluegge <fynnfluegge@gmx.de>
ishaan-jaff and fynnfluegge authored Sep 28, 2023
1 parent 640d4be commit 48e6910
Showing 3 changed files with 546 additions and 224 deletions.
4 changes: 2 additions & 2 deletions doc_comments_ai/llm.py
```diff
--- a/doc_comments_ai/llm.py
+++ b/doc_comments_ai/llm.py
@@ -5,7 +5,7 @@
 import inquirer
 from langchain import LLMChain, PromptTemplate
-from langchain.chat_models import ChatOpenAI
+from langchain.chat_models import ChatLiteLLM
 from langchain.llms import LlamaCpp
 
 from doc_comments_ai import utils
@@ -33,7 +33,7 @@ def __init__(
                 verbose=False,
             )
         else:
-            self.llm = ChatOpenAI(
+            self.llm = ChatLiteLLM(
                 temperature=0.9, max_tokens=max_tokens, model=model.value
             )
         self.template = (
```
