Error details: deepseek request failed for model 'deepseek-chat' (code:
invalid_request_error) (status=400): Error code: 400 - {'error': {'message':
"This model's maximum context length is 131072 tokens. However, you requested
131134 tokens (122942 in the messages, 8192 in the completion). Please reduce
the length of the messages or completion.", 'type': 'invalid_request_error',
'param': None, 'code': 'invalid_request_error'}}
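
The failure is plain arithmetic: 122942 prompt tokens plus the 8192-token completion budget is 131134, which is 62 tokens over deepseek-chat's 131072-token context window, so either the messages have to be trimmed or max_tokens lowered before retrying. Below is a minimal sketch of one way to clamp the completion budget, assuming the OpenAI-compatible client that DeepSeek's API accepts; estimate_tokens, SAFETY_MARGIN, and clamped_completion are illustrative names introduced here, and the character-based token estimate is only a rough heuristic, not DeepSeek's actual tokenizer.

    # Sketch: keep prompt tokens + max_tokens inside the 131072-token window.
    # Assumes DeepSeek's OpenAI-compatible endpoint; the token estimate below
    # is a hypothetical heuristic (~4 characters per token), not exact.
    from openai import OpenAI

    CONTEXT_LIMIT = 131072   # deepseek-chat maximum context length (from the error above)
    SAFETY_MARGIN = 256      # headroom for tokenizer-count mismatch (assumption)

    def estimate_tokens(messages):
        # Rough estimate only; replace with a real tokenizer count if available.
        return sum(len(m["content"]) for m in messages) // 4

    def clamped_completion(client, messages, requested_max_tokens=8192):
        prompt_tokens = estimate_tokens(messages)
        budget = CONTEXT_LIMIT - prompt_tokens - SAFETY_MARGIN
        if budget <= 0:
            raise ValueError("messages alone exceed the context window; trim them first")
        return client.chat.completions.create(
            model="deepseek-chat",
            messages=messages,
            max_tokens=min(requested_max_tokens, budget),
        )

    client = OpenAI(api_key="YOUR_KEY", base_url="https://api.deepseek.com")
    # resp = clamped_completion(client, [{"role": "user", "content": "..."}])

If the messages themselves are close to the limit, shrinking max_tokens alone will not be enough; dropping or summarizing the oldest messages before the call is the other half of the fix.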