[Bug]: deepseek official api format wrong? #5531

Open
@pickle780

Description

Is there an existing issue for the same bug?

  • I have checked the existing issues.

RAGFlow workspace code commit ID

3b30799

RAGFlow image version

RAGFLOW_IMAGE=infiniflow/ragflow:v0.16.0

Other environment information

Windows 10
WSL
Docker deployment

Actual behavior

00:41:53 Knowledge graph extraction error:Expecting value: line 1 column 1 (char 0)
00:41:53 Knowledge graph extraction error:ERROR: Error code: 400 - {'error': {'message': 'The last message of deepseek-reasoner must be a user message, or an assistant message with prefix mode on (refer to https://api-docs.deepseek.com/guides/chat_prefix_completion).', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_request_error'}}

Expected behavior

Knowledge graph extraction should complete without errors when using the official DeepSeek API.

The error message points to https://api-docs.deepseek.com/guides/chat_prefix_completion; I'm not sure whether this is related.
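
As I read that guide, the constraint in the 400 error is: a `deepseek-reasoner` request must end with a user message, unless chat prefix completion (beta) is used, in which case the last message is an assistant message carrying `"prefix": True` and the request goes to the beta base URL. Below is a minimal sketch (not RAGFlow code) using the same `openai` Python client that `rag/llm/chat_model.py` uses; the API key and prompt strings are placeholders, and the beta base URL is taken from the guide:

```python
import openai
from openai import OpenAI

client = OpenAI(
    api_key="sk-...",                      # placeholder
    base_url="https://api.deepseek.com",   # standard endpoint
)

# Accepted: the conversation ends with a user message.
ok = client.chat.completions.create(
    model="deepseek-reasoner",
    messages=[
        {"role": "system", "content": "You extract knowledge graphs."},
        {"role": "user", "content": "Output:"},
    ],
)

# Rejected with the 400 from the logs: the last message is an assistant
# message and prefix mode is not enabled.
try:
    client.chat.completions.create(
        model="deepseek-reasoner",
        messages=[
            {"role": "user", "content": "Extract entities from ..."},
            {"role": "assistant", "content": "Output:"},
        ],
    )
except openai.BadRequestError as e:
    print(e)  # "The last message of deepseek-reasoner must be a user message, ..."

# Prefix mode (beta, per the guide above): a trailing assistant message with
# "prefix": True, sent against the beta base URL.
beta = OpenAI(api_key="sk-...", base_url="https://api.deepseek.com/beta")
prefixed = beta.chat.completions.create(
    model="deepseek-reasoner",
    messages=[
        {"role": "user", "content": "Extract entities from ..."},
        {"role": "assistant", "content": "Output:", "prefix": True},
    ],
)
```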

Steps to reproduce

I created a knowledge base using the official DeepSeek API (deepseek-reasoner) and ran knowledge graph extraction; it then fails with the errors above. With a local model, the same setup works fine and the error does not occur.

Additional information

2025-03-03 00:42:39 Exception: ERROR: Error code: 400 - {'error': {'message': 'The last message of deepseek-reasoner must be a user message, or an assistant message with prefix mode on (refer to https://api-docs.deepseek.com/guides/chat_prefix_completion).', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_request_error'}}
2025-03-03 00:42:41 2025-03-03 00:42:41,437 ERROR 28 error extracting graph
2025-03-03 00:42:41 Traceback (most recent call last):
2025-03-03 00:42:41 File "/ragflow/graphrag/light/graph_extractor.py", line 95, in _process_single_content
2025-03-03 00:42:41 final_result = self._chat(hint_prompt, [{"role": "user", "content": "Output:"}], gen_conf)
2025-03-03 00:42:41 File "/ragflow/graphrag/general/extractor.py", line 61, in _chat
2025-03-03 00:42:41 response = self._llm.chat(system, hist, conf)
2025-03-03 00:42:41 File "/ragflow/api/db/services/llm_service.py", line 288, in chat
2025-03-03 00:42:41 txt, used_tokens = self.mdl.chat(system, history, gen_conf)
2025-03-03 00:42:41 File "/ragflow/rag/llm/chat_model.py", line 46, in chat
2025-03-03 00:42:41 response = self.client.chat.completions.create(
2025-03-03 00:42:41 File "/ragflow/.venv/lib/python3.10/site-packages/openai/_utils/_utils.py", line 274, in wrapper
2025-03-03 00:42:41 return func(*args, **kwargs)
2025-03-03 00:42:41 File "/ragflow/.venv/lib/python3.10/site-packages/openai/resources/chat/completions.py", line 704, in create
2025-03-03 00:42:41 return self._post(
2025-03-03 00:42:41 File "/ragflow/.venv/lib/python3.10/site-packages/openai/_base_client.py", line 1260, in post
2025-03-03 00:42:41 return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
2025-03-03 00:42:41 File "/ragflow/.venv/lib/python3.10/site-packages/openai/_base_client.py", line 937, in request
2025-03-03 00:42:41 return self._request(
2025-03-03 00:42:41 File "/ragflow/.venv/lib/python3.10/site-packages/openai/_base_client.py", line 1043, in _request
2025-03-03 00:42:41 return self._process_response(
2025-03-03 00:42:41 File "/ragflow/.venv/lib/python3.10/site-packages/openai/_base_client.py", line 1142, in _process_response
2025-03-03 00:42:41 return api_response.parse()
2025-03-03 00:42:41 File "/ragflow/.venv/lib/python3.10/site-packages/openai/_response.py", line 314, in parse
2025-03-03 00:42:41 parsed = self._parse(to=to)
2025-03-03 00:42:41 File "/ragflow/.venv/lib/python3.10/site-packages/openai/_response.py", line 256, in _parse
2025-03-03 00:42:41 data = response.json()
2025-03-03 00:42:41 File "/ragflow/.venv/lib/python3.10/site-packages/httpx/_models.py", line 764, in json
2025-03-03 00:42:41 return jsonlib.loads(self.content, **kwargs)
2025-03-03 00:42:41 File "/usr/lib/python3.10/json/__init__.py", line 346, in loads
2025-03-03 00:42:41 return _default_decoder.decode(s)
2025-03-03 00:42:41 File "/usr/lib/python3.10/json/decoder.py", line 337, in decode
2025-03-03 00:42:41 obj, end = self.raw_decode(s, idx=_w(s, 0).end())
2025-03-03 00:42:41 File "/usr/lib/python3.10/json/decoder.py", line 355, in raw_decode
2025-03-03 00:42:41 raise JSONDecodeError("Expecting value", s, err.value) from None
2025-03-03 00:42:41 json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)
(The same traceback repeats for four more "error extracting graph" attempts at 00:42:49 and 00:42:53, each ending in the same json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0).)
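
For what it's worth, the repeated `Expecting value: line 1 column 1 (char 0)` is raised when the openai client tries to JSON-parse a response body that is not valid JSON, so it is downstream of whatever the server returns for the rejected request. The traceback shows `graph_extractor.py` calling `self._chat(hint_prompt, [{"role": "user", "content": "Output:"}], gen_conf)`, i.e. the history it builds ends with a user message. Below is a hedged workaround sketch, not RAGFlow's actual code: `chat_reasoner_safe` is a hypothetical helper that trims any trailing assistant turn and forwards only parameters assumed to be accepted by `deepseek-reasoner`.

```python
from openai import OpenAI

client = OpenAI(api_key="sk-...", base_url="https://api.deepseek.com")  # placeholder

def chat_reasoner_safe(system: str, history: list[dict], gen_conf: dict) -> str:
    """Hypothetical guard around a deepseek-reasoner call: make sure the
    payload ends with a user message and only forward parameters the
    endpoint is known to accept."""
    msgs = [{"role": "system", "content": system}] + list(history)
    # Drop any trailing assistant turns; deepseek-reasoner rejects them
    # unless prefix mode is enabled (see the guide linked above).
    while msgs and msgs[-1]["role"] == "assistant":
        msgs.pop()
    # Assumption: sampling parameters such as temperature are not honored by
    # deepseek-reasoner, so forward only a conservative whitelist.
    conf = {k: v for k, v in gen_conf.items() if k in {"max_tokens"}}
    resp = client.chat.completions.create(
        model="deepseek-reasoner", messages=msgs, **conf
    )
    return resp.choices[0].message.content
```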

Metadata

Assignees

No one assigned

    Labels

    🐞 bug (Something isn't working)

    Type

    No type

    Projects

    No projects

    Milestone

    No milestone

    Relationships

    None yet

    Development

    No branches or pull requests
