
Conversation

@massi-ang
Contributor

Description:
Added support for the Cohere Command model via Bedrock.
With this change it is now possible to use the cohere.command-text-v14 model via the Bedrock API.

About streaming: the Cohere model outputs two additional chunks at the end of the text being generated via streaming: a chunk containing the text <EOS_TOKEN>, and a chunk indicating the end of the stream. In this implementation I chose to ignore both chunks. An alternative would be to replace <EOS_TOKEN> with \n.
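
A minimal sketch of the chunk-filtering idea described above, assuming the raw streaming chunks are dicts with "text" and "is_finished" fields shaped like the Cohere response; the helper name and field names are illustrative, not the PR's actual code:

```python
# Hypothetical helper illustrating how the two trailing chunks could be dropped.
# The "text" / "is_finished" keys are assumptions about the Cohere chunk shape.
def filter_cohere_stream(chunks):
    for chunk in chunks:
        if chunk.get("is_finished"):
            # End-of-stream marker: carries no generated text, skip it.
            continue
        text = chunk.get("text", "")
        if text == "<EOS_TOKEN>":
            # Sentinel token emitted after the completion; skip it
            # (or map it to "\n" if the alternative behaviour is preferred).
            continue
        yield text
```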

Tests: manually tested that the new model works with both llm.generate() and llm.stream().
Tested with the temperature, p, and stop parameters.
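
For reference, a usage sketch along the lines of what was tested. The exact model_kwargs keys (here using the Cohere provider names temperature, p, stop_sequences) are assumptions, and AWS credentials/region are taken from the environment:

```python
from langchain.llms import Bedrock

# Assumes AWS credentials and region are configured in the environment.
llm = Bedrock(
    model_id="cohere.command-text-v14",
    model_kwargs={"temperature": 0.5, "p": 0.9, "stop_sequences": ["\n\n"]},
)

# Batch generation
result = llm.generate(["Write a one-line summary of Amazon Bedrock."])
print(result.generations[0][0].text)

# Streaming: tokens arrive incrementally; the trailing <EOS_TOKEN> and
# end-of-stream chunks are ignored per the description above.
for token in llm.stream("Write a one-line summary of Amazon Bedrock."):
    print(token, end="", flush=True)
```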

Issue: #11181

Dependencies: No new dependencies

Tag maintainer: @baskaryan

Twitter handle: mangelino


@hwchase17 (Contributor) left a comment

looks solid to me!

@hwchase17 added the lgtm label on Sep 29, 2023
@demchuk-alex

Thanks a lot, @massi-ang. I was going to add this fix as well but found that somebody had already done it. I've got a project that depends on Cohere on Bedrock, so I'd much appreciate it if this is merged as soon as possible :) In the meantime I'm working around it by inheriting the class and overriding a few methods :) Thanks.
