This model's maximum context length is 131072 tokens. However, you requested 131134 tokens (122942 in the messages, 8192 in the completion). #621

@jinwater88

Description

Error details: deepseek request failed for model 'deepseek-chat' (code: invalid_request_error) (status=400): Error code: 400 - {'error': {'message': "This model's maximum context length is 131072 tokens. However, you requested 131134 tokens (122942 in the messages, 8192 in the completion). Please reduce the length of the messages or completion.", 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_request_error'}}
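The arithmetic in the error points at the fix: the completion budget must fit in whatever context remains after the prompt, so the requested `max_tokens` should be clamped before sending the request. A minimal sketch (assuming the prompt token count is already known, e.g. from a tokenizer; the helper name and numbers are illustrative, with the figures taken from the error above):

```python
# Context-window bookkeeping for deepseek-chat (131072-token window,
# per the error message in this issue).
MODEL_CONTEXT_LIMIT = 131072

def clamp_max_tokens(prompt_tokens: int, requested_completion: int,
                     context_limit: int = MODEL_CONTEXT_LIMIT) -> int:
    """Shrink the completion budget so prompt + completion fits the window.

    Returns 0 when the prompt alone already fills (or exceeds) the window,
    in which case the messages themselves must be truncated or summarized.
    """
    remaining = context_limit - prompt_tokens
    return max(0, min(requested_completion, remaining))

# The failing request from this report: 122942 prompt + 8192 completion
# = 131134 tokens, 62 over the limit. Clamping leaves room for 8130.
print(clamp_max_tokens(122942, 8192))  # → 8130
```

Passing the clamped value as `max_tokens` avoids the 400 for prompts that still fit; prompts at or over 131072 tokens need message truncation instead.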

Metadata

Assignees

Labels

defect (Something isn't working)

Projects

No projects

Milestone

No milestone

Relationships

None yet

Development

No branches or pull requests