This may be a stupid question, but I tried using the LiteLLM model provider to use Claude via AWS Bedrock like this:

```python
from agents.extensions.models.litellm_model import LitellmModel

model = LitellmModel(
    model="bedrock/anthropic.claude-3-5-sonnet-20241022-v2:0",
)
```
As far as I can tell, authentication is handled through the environment variables described in the LiteLLM documentation.
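For reference, this is roughly how I'm providing credentials (variable names taken from the LiteLLM Bedrock docs; the values here are placeholders):

```python
import os

# AWS credentials for LiteLLM's Bedrock provider are picked up from the
# environment rather than passed as an api_key argument:
os.environ["AWS_ACCESS_KEY_ID"] = "..."      # placeholder
os.environ["AWS_SECRET_ACCESS_KEY"] = "..."  # placeholder
os.environ["AWS_REGION_NAME"] = "us-east-1"
```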
What does the `api_key` parameter do? Is it only relevant for certain inference providers (e.g. direct OpenAI)?