Open
Labels: SDK, bug (Something isn't working), docs (Issues related to LiteLLM documentation), llm translation
Description
What happened?
The /models endpoint lists deprecated Groq models.
Example curl:
GET https://<litellm url>/v1/models
Response:
...
{
"id": "groq/llama-guard-3-8b",
"object": "model",
"created": 1677610602,
"owned_by": "openai"
},
...
Error when performing a chat/completion request against one of these models:
"message": "litellm.BadRequestError: GroqException - {\"error\":{\"message\":\"The model `llama-guard-3-8b` has been decommissioned and is no longer supported. Please refer to https://console.groq.com/docs/deprecations for a recommendation on which model to use instead.\"
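Until the proxy filters these out itself, a client-side workaround is possible. The sketch below is a minimal, hypothetical illustration (the deprecated-model set and the sample response are assumptions, not LiteLLM's actual deprecation data): it filters a /v1/models-style response against a hand-maintained set of decommissioned Groq model names.

```python
# Hypothetical client-side filter for a /v1/models-style response.
# DEPRECATED_GROQ_MODELS is an assumed, hand-maintained set; consult
# https://console.groq.com/docs/deprecations for the authoritative list.
DEPRECATED_GROQ_MODELS = {"llama-guard-3-8b"}

def filter_deprecated(models_response: dict) -> list[dict]:
    """Drop model entries whose Groq model name is in the deprecated set."""
    active = []
    for model in models_response.get("data", []):
        # Model ids are prefixed with the provider, e.g. "groq/llama-guard-3-8b".
        name = model["id"].removeprefix("groq/")
        if name not in DEPRECATED_GROQ_MODELS:
            active.append(model)
    return active

# Sample response shaped like the /v1/models output shown above.
sample = {
    "object": "list",
    "data": [
        {"id": "groq/llama-guard-3-8b", "object": "model"},
        {"id": "groq/llama-3.3-70b-versatile", "object": "model"},
    ],
}
print([m["id"] for m in filter_deprecated(sample)])
# → ['groq/llama-3.3-70b-versatile']
```

This only hides the symptom on the client; the underlying fix is removing the decommissioned entries from LiteLLM's Groq model list.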
Relevant log output
What part of LiteLLM is this about?
SDK (litellm Python package)
What LiteLLM version are you on?
v1.80.5-nightly
Twitter / LinkedIn details
No response