Add AWS Bedrock support to ChatAnthropic #154
Conversation
req >=0.5.1 is needed to support bedrock: wojtekmach/req#374
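To pick up that fix, the dependency floor can be expressed in `mix.exs`. A minimal sketch (the `langchain` entry and its version are illustrative, not taken from this PR):

```elixir
# mix.exs — require req >= 0.5.1 so the AWS Signature v4 request step is available
defp deps do
  [
    {:langchain, "~> 0.3"},  # illustrative version
    {:req, ">= 0.5.1"}
  ]
end
```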
| Watching your PR, I saw your selection and use of Mimic. I researched it and was happy enough with it that I moved to it generally. That's here in PR #155. It creates a conflict that should be easily resolved. Also, just skimming your work here, I'm somewhat shocked at how hard and complicated AWS Bedrock makes it. Yeesh! | 
 Oh, nice 👍 
 Yep, it's surprisingly complicated. The Azure OpenAI service looks like a very light wrapper around OpenAI's API structure for the most part, making it much easier to add support for. I'll get back to this PR over the next week or so, but it's mostly there functionality-wise. Feel free to leave any feedback on the PR as it stands; I think there are only a few changes needed around error cases and possibly some testing improvements. I added the ability to run the same live test case/code against both Anthropic and Bedrock. It's convenient since both can be tested the same way, but a bit ugly, and I haven't found a way to run a specific live test for only Bedrock or Anthropic (specifying line numbers to `mix test`  |
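One possible way to run a single backend's live tests in isolation is ExUnit's tag filtering rather than line numbers. A hypothetical sketch (the module, tag, and test names are invented for illustration; they are not the PR's actual test layout):

```elixir
# Hypothetical sketch: tag shared live tests per backend so each can be
# selected with ExUnit's --only / --exclude filters.
defmodule LangChain.ChatModels.ChatAnthropicLiveTest do
  use ExUnit.Case

  @tag live_anthropic: true
  test "streams a response (Anthropic API)" do
    # ... shared assertion helper pointed at the Anthropic endpoint ...
  end

  @tag live_bedrock: true
  test "streams a response (Bedrock)" do
    # ... same helper pointed at Bedrock ...
  end
end
```

Then `mix test --only live_bedrock` would run only the Bedrock cases, and `mix test --only live_anthropic` only the direct-API ones.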
 Was this merged? | 
| Hi @stevehodgkiss! Where does this stand currently? Do you feel it's ready? | 
| @stevehodgkiss are there any outstanding changes you need support with to get this across the finish line? Happy to find time if so! | 
# Conflicts:
#	test/chat_models/chat_anthropic_test.exs
| Hey @brainlid & @doomspork 👋 Sorry about the delay getting back to this. I've updated the branch with main and tested it with an app that was using a version of this adapter already. Some thoughts on implementation: 
 The adapter itself has been working fine, so I'm happy with it functionally. EDIT: Confirmed this works with the new/upgraded Claude 3.5 Sonnet model on bedrock ( | 
Amazing work @stevehodgkiss! Adding Bedrock support was no easy task.
I think a refactoring opportunity might come when Bedrock support is added to another model.
| Your code looks clean, clear and thoughtful. ❤️💛💙💜 | 
* 'main' of github.com:brainlid/langchain:
  - Add AWS Bedrock support to ChatAnthropic (#154)
  - Handle functions with no parameters for Google AI (#183)
  - Handle missing token usage fields for Google AI (#184)
  - Handle empty text parts from GoogleAI responses (#181)
  - Support system instructions for Google AI (#182)
  - feat: add OpenAI's new structured output API (#180)
  - Support strict mode for tools (#173)
  - Do not duplicate tool call parameters if they are identical (#174)
  - 🐛 cast tool_calls arguments correctly inside message_deltas (#175)
AWS is quite a bit more involved than the Azure/OpenAI integration. It has a custom streaming protocol wrapping the underlying Anthropic messages when streaming, and uses request signing rather than a simple bearer token. There's also a slight difference in the JSON payload:

- The `model` name is in the query string, & `stream` is defined by calling a different action (a different path: `InvokeModel` vs `InvokeModelWithResponseStream`). Docs.
- The response type is `application/vnd.amazon.eventstream` when streaming.

TODO:

- Errors return a `%{"message" => ".."}` response, so this falls through to the unexpected response handling. The stream returns empty maps with the error name as the key, which need handling in `do_process_response`. Wondering how to handle these error cases between the stream and non-stream functions.
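For the request-signing side, req >= 0.5 exposes an `:aws_sigv4` step, which is presumably what the adapter leans on instead of a bearer token. A rough sketch of a signed non-streaming `InvokeModel` call (the region, model ID, and option values are illustrative, not taken from this PR):

```elixir
# Sketch: signing a Bedrock InvokeModel request with req's AWS SigV4 support
# (req >= 0.5.1). URL shape and values are illustrative.
model = "anthropic.claude-3-5-sonnet-20240620-v1:0"

Req.new(
  base_url: "https://bedrock-runtime.us-east-1.amazonaws.com",
  aws_sigv4: [
    service: "bedrock",
    region: "us-east-1",
    access_key_id: System.fetch_env!("AWS_ACCESS_KEY_ID"),
    secret_access_key: System.fetch_env!("AWS_SECRET_ACCESS_KEY")
  ]
)
|> Req.post!(
  url: "/model/#{model}/invoke",
  json: %{
    "anthropic_version" => "bedrock-2023-05-31",
    "max_tokens" => 1024,
    "messages" => [%{"role" => "user", "content" => "Hello"}]
  }
)
```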
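One possible shape for the TODO above is to pattern-match both Bedrock error forms ahead of the generic fallback. A hypothetical sketch (clause names and return shapes are simplified; the library's real `do_process_response` takes more arguments):

```elixir
# Hypothetical sketch of handling Bedrock's two error shapes before the
# generic "unexpected response" clause.
defmodule BedrockErrorSketch do
  # Non-streaming errors: a bare %{"message" => ...} body.
  def do_process_response(%{"message" => message}) do
    {:error, "Received error from API: #{message}"}
  end

  # Streaming errors: a single-key map whose key is the exception name,
  # e.g. %{"internalServerException" => %{}} (key name is an assumption).
  def do_process_response(%{} = map) when map_size(map) == 1 do
    [error_name] = Map.keys(map)
    {:error, "Received error from API: #{error_name}"}
  end

  def do_process_response(_other), do: {:error, "Unexpected response"}
end
```

Sharing these clauses lets the stream and non-stream paths funnel into the same error handling rather than duplicating it per transport.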