🥰 Feature Request Description
AWS Bedrock is rolling out cross-region inference for Claude 3.5 and some Llama models. This causes compatibility issues with the current LobeChat environment configuration (lobe.env).
🧐 Proposed Solution
In a designated region such as "us-west-2", Claude 3.5 Sonnet v2 is invoked with the model ID "anthropic.claude-3-5-sonnet-20241022-v2:0",
whereas cross-region inference uses the inference profile ID "us.anthropic.claude-3-5-sonnet-20241022-v2:0".
This difference between the inference profile ID (aka "model identifier") and the base model ID leads to the error
"Invocation of model ID anthropic.claude-3-5-sonnet-20241022-v2:0 with on-demand throughput isn't supported"
in LobeChat, and
"The provided model identifier is invalid" in LiteLLM Proxy.
Please allow the cross-region inference profile ID to be specified as the model identifier.
📝 Additional Information
AWS cross-region inference documentation:
https://docs.aws.amazon.com/bedrock/latest/userguide/inference-profiles-support.html
https://us-east-1.console.aws.amazon.com/bedrock/home?region=us-east-1#/inference-profiles