
deepseek-coder-v2 doesn't seem to get the context #162

@cprn

Description

Hi. I tried code generation with the llama3 model and it worked fine, but with a custom Ollama agent running deepseek-coder-v2 the system_prompt doesn't seem to be sent, so the received response contains more than just a code snippet. Is this a bug or am I doing something wrong?

The relevant part of the inspect output — note the empty system message content in the payload (full output in attachment):

  _queries = {
    ["21840817_687b_40e1_9cda_faacb9efd554"] = {
      buf = 5,
      ex_id = 1,
      first_line = 0,
      handler = <function 29>,
      last_line = 0,
      ns_id = 27,
      on_exit = <function 30>,
      payload = {
        messages = { {
            content = "",
            role = "system"
          }, {
            content = "What's your system_prompt",
            role = "user"
          } },
        model = "deepseek-coder-v2",
        stream = true,
        temperature = 1,
        top_p = 1
      },
      provider = "ollama",
      [...cut...]
      response = " As an intelligent assistant DeepSeek Coder developed by the Chinese company DeepSeek, my system prompt is designed to provide information assistance and answer questions. My responses are based on artificial intelligence algorithms trained with large amounts of data. I am ready to assist users in acquiring knowledge and solving problems through dialogues.",
      timestamp = 1721658433
    }
  }
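As a way to isolate whether the problem is in gp.nvim or in the model itself, one could query Ollama directly from Neovim, bypassing the plugin entirely. This is only a sketch — it assumes a default local Ollama install listening on `localhost:11434` and uses its `/api/chat` endpoint; the prompt strings are made up for the test:

```lua
-- Hypothetical diagnostic: send a request with an explicit system message
-- straight to Ollama, to check whether deepseek-coder-v2 honors system
-- prompts at all, independently of gp.nvim.
local body = vim.json.encode({
  model = "deepseek-coder-v2",
  stream = false,
  messages = {
    { role = "system", content = "Reply with the single word PONG." },
    { role = "user", content = "ping" },
  },
})
local out = vim.fn.system({
  "curl", "-s", "http://localhost:11434/api/chat",
  "-d", body,
})
print(out)
```

If the model obeys the system message here but not through the plugin, that would point at the payload being built with an empty system prompt, as the dump above suggests.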

And this is my config (using Lazy):

    {'robitx/gp.nvim',
        config = function()
            local conf = {
                providers = {
                    openai = {disable = true},
                    googleai = {disable = true},
                    ollama = {disable = false},
                },
                agents = {
                    {name = "ChatOllamaLlama3", disable = true},
                    {name = "CodeOllamaLlama3", disable = true},
                    {
                        provider = "ollama",
                        name = "ChatOllamaDeepseekCoderV2",
                        chat = true,
                        command = false,
                        model = {
                            model = "deepseek-coder-v2",
                            num_ctx = 8192,
                        },
                        system_prompt = "You are a general AI assistant.",
                    },
                    {
                        provider = "ollama",
                        name = "CodeOllamaDeepseekCoderV2",
                        chat = false,
                        command = true,
                        model = {
                            model = "deepseek-coder-v2",
                            temperature = 1.9,
                            top_p = 1,
                            num_ctx = 8192,
                        },
                        system_prompt = "You are an AI working as a code editor providing answers.\n\n"
                            .. "Use 4 SPACES FOR INDENTATION.\n"
                            .. "Please AVOID COMMENTARY OUTSIDE OF THE SNIPPET RESPONSE.\n"
                            .. "START AND END YOUR ANSWER WITH:\n\n```",
                    },
                },
            }
            require("gp").setup(conf)
        end,
    },
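To narrow it down further, it might help to check whether the agent definition (including its system_prompt) was registered at all, as opposed to being dropped before the request is built. A sketch, assuming gp.nvim exposes its resolved agents as `require("gp").agents` (unverified — the table name and field layout are guesses based on the setup() config above):

```lua
-- Hypothetical: print the registered agent's system_prompt from Neovim
-- (e.g. via :lua). If this prints the expected prompt, the loss happens
-- later, when the request payload is assembled.
local gp = require("gp")
local agent = gp.agents and gp.agents["CodeOllamaDeepseekCoderV2"]
print(agent and agent.system_prompt or "agent not found")
```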

full GpInspectPlugin output
