fix: added support for google vertexAI #252

Merged 1 commit on Nov 3, 2023
src/ragas/llms/base.py: 17 changes (13 additions & 4 deletions)
@@ -4,9 +4,9 @@
 import typing as t
 from abc import ABC, abstractmethod
 
-from langchain.chat_models import AzureChatOpenAI, ChatOpenAI, BedrockChat
+from langchain.chat_models import AzureChatOpenAI, BedrockChat, ChatOpenAI, ChatVertexAI
 from langchain.chat_models.base import BaseChatModel
-from langchain.llms import AzureOpenAI, OpenAI, Bedrock
+from langchain.llms import AzureOpenAI, Bedrock, OpenAI, VertexAI
 from langchain.llms.base import BaseLLM
 from langchain.schema import LLMResult

@@ -20,13 +20,22 @@
 def isOpenAI(llm: BaseLLM | BaseChatModel) -> bool:
     return isinstance(llm, OpenAI) or isinstance(llm, ChatOpenAI)
 
+
 def isBedrock(llm: BaseLLM | BaseChatModel) -> bool:
     return isinstance(llm, Bedrock) or isinstance(llm, BedrockChat)
 
+
 # have to specify it twice for runtime and static checks
-MULTIPLE_COMPLETION_SUPPORTED = [OpenAI, ChatOpenAI, AzureOpenAI, AzureChatOpenAI]
+MULTIPLE_COMPLETION_SUPPORTED = [
+    OpenAI,
+    ChatOpenAI,
+    AzureOpenAI,
+    AzureChatOpenAI,
+    ChatVertexAI,
+    VertexAI,
+]
 MultipleCompletionSupportedLLM = t.Union[
-    OpenAI, ChatOpenAI, AzureOpenAI, AzureChatOpenAI
+    OpenAI, ChatOpenAI, AzureOpenAI, AzureChatOpenAI, ChatVertexAI, VertexAI
 ]


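For context, a minimal sketch of how the newly supported Vertex AI wrappers would pass the multiple-completion allow-list extended above. It assumes the module layout shown in this diff (ragas.llms.base) plus a Google Cloud project with Vertex AI enabled; the model_name values are illustrative and not part of this PR.

```python
# Sketch only: constructing the Vertex AI wrappers requires google-cloud-aiplatform
# and valid Google Cloud credentials (e.g. GOOGLE_APPLICATION_CREDENTIALS).
from langchain.chat_models import ChatVertexAI
from langchain.llms import VertexAI

from ragas.llms.base import MULTIPLE_COMPLETION_SUPPORTED

# Both wrappers now appear in the allow-list introduced by this PR.
chat_llm = ChatVertexAI(model_name="chat-bison")  # model_name is illustrative
text_llm = VertexAI(model_name="text-bison")      # model_name is illustrative

for llm in (chat_llm, text_llm):
    supported = isinstance(llm, tuple(MULTIPLE_COMPLETION_SUPPORTED))
    print(type(llm).__name__, "supports multiple completions:", supported)  # -> True
```

The PR itself only extends that allow-list and the matching Union type; how ragas uses the flag when generating completions is unchanged.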