
Commit 5fbdb45

Update Bedrock modules for LangChain v0.3.x (#1487)
Updates the LangChain modules for Amazon Bedrock (AWS) in the "customizations" section:

- Use `langchain_aws` instead of `langchain_community`.
- Use `ChatBedrock` instead of `BedrockChat`.
1 parent: 9686b14
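In practice the migration is mostly an import and constructor change; a minimal sketch of the new style, assuming the `langchain-aws` package is installed (the model id and region below are placeholders, and `ChatBedrockConverse` is the Converse-API variant used in the updated docs):

```python
# Old style (what the docs replace): Bedrock classes imported from langchain_community
# from langchain_community.chat_models import BedrockChat

# New style: the langchain-aws package ships the Bedrock integrations
from langchain_aws import ChatBedrockConverse

llm = ChatBedrockConverse(
    model="anthropic.claude-3-5-sonnet-20240620-v1:0",  # placeholder model id
    region_name="us-east-1",                            # placeholder region
)
```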

2 files changed (+22, -15 lines)

docs/extra/components/choose_evaluvator_llm.md

+8, -7
@@ -21,19 +21,20 @@
 config = {
     "credentials_profile_name": "your-profile-name", # E.g "default"
     "region_name": "your-region-name", # E.g. "us-east-1"
-    "model_id": "your-model-id", # E.g "anthropic.claude-v2"
-    "model_kwargs": {"temperature": 0.4},
+    "llm": "your-llm-model-id", # E.g "anthropic.claude-3-5-sonnet-20240620-v1:0"
+    "embeddings": "your-embedding-model-id", # E.g "amazon.titan-embed-text-v2:0"
+    "temperature": 0.4,
 }
 ```
 define you LLMs
 ```python
-from langchain_aws.chat_models import BedrockChat
+from langchain_aws import ChatBedrockConverse
 from ragas.llms import LangchainLLMWrapper
-evaluator_llm = LangchainLLMWrapper(BedrockChat(
+evaluator_llm = LangchainLLMWrapper(ChatBedrockConverse(
     credentials_profile_name=config["credentials_profile_name"],
     region_name=config["region_name"],
-    endpoint_url=f"https://bedrock-runtime.{config['region_name']}.amazonaws.com",
-    model_id=config["model_id"],
-    model_kwargs=config["model_kwargs"],
+    base_url=f"https://bedrock-runtime.{config['region_name']}.amazonaws.com",
+    model=config["llm"],
+    temperature=config["temperature"],
 ))
 ```
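For context, the `evaluator_llm` built in the updated snippet is what ragas takes as the judge model; a minimal usage sketch, assuming an `eval_dataset` has been prepared elsewhere (the metric choice is only illustrative):

```python
from ragas import evaluate
from ragas.metrics import faithfulness

# evaluator_llm is the LangchainLLMWrapper(ChatBedrockConverse(...)) defined above;
# eval_dataset is assumed to be an evaluation dataset built elsewhere.
results = evaluate(dataset=eval_dataset, metrics=[faithfulness], llm=evaluator_llm)
```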

docs/howtos/customizations/customize_models.md

+14, -8
@@ -96,31 +96,37 @@ Yay! Now are you ready to use ragas with Google VertexAI endpoints
 
 ### AWS Bedrock
 
+```bash
+pip install langchain_aws
+```
+
 ```python
-from langchain_community.chat_models import BedrockChat
-from langchain_community.embeddings import BedrockEmbeddings
+from langchain_aws import ChatBedrockConverse
+from langchain_aws import BedrockEmbeddings
 from ragas.llms import LangchainLLMWrapper
 from ragas.embeddings import LangchainEmbeddingsWrapper
 
 config = {
     "credentials_profile_name": "your-profile-name", # E.g "default"
     "region_name": "your-region-name", # E.g. "us-east-1"
-    "model_id": "your-model-id", # E.g "anthropic.claude-v2"
-    "model_kwargs": {"temperature": 0.4},
+    "llm": "your-llm-model-id", # E.g "anthropic.claude-3-5-sonnet-20240620-v1:0"
+    "embeddings": "your-embedding-model-id", # E.g "amazon.titan-embed-text-v2:0"
+    "temperature": 0.4,
 }
 
-bedrock_llm = BedrockChat(
+bedrock_llm = ChatBedrockConverse(
     credentials_profile_name=config["credentials_profile_name"],
     region_name=config["region_name"],
-    endpoint_url=f"https://bedrock-runtime.{config['region_name']}.amazonaws.com",
-    model_id=config["model_id"],
-    model_kwargs=config["model_kwargs"],
+    base_url=f"https://bedrock-runtime.{config['region_name']}.amazonaws.com",
+    model=config["llm"],
+    temperature=config["temperature"],
 )
 
 # init the embeddings
 bedrock_embeddings = BedrockEmbeddings(
     credentials_profile_name=config["credentials_profile_name"],
     region_name=config["region_name"],
+    model_id=config["embeddings"],
 )
 
 bedrock_llm = LangchainLLMWrapper(bedrock_llm)
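The embeddings object would typically be wrapped the same way before both are handed to ragas; a short sketch of that follow-on step, not part of the diff above (`eval_dataset` and the metric are placeholders):

```python
from ragas import evaluate
from ragas.metrics import answer_relevancy
from ragas.embeddings import LangchainEmbeddingsWrapper

# Wrap the Bedrock embeddings just like the LLM, then pass both to evaluate();
# answer_relevancy uses both an LLM and embeddings under the hood.
bedrock_embeddings = LangchainEmbeddingsWrapper(bedrock_embeddings)
results = evaluate(
    dataset=eval_dataset,        # assumed: an evaluation dataset built elsewhere
    metrics=[answer_relevancy],
    llm=bedrock_llm,
    embeddings=bedrock_embeddings,
)
```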
