
feat(prompt): improved answer readability with markdown and aerataed #1190

Merged 1 commit into main on Sep 17, 2023

Conversation

StanGirard (Collaborator) commented:

Prompt Eng


@StanGirard StanGirard merged commit 4a0a7ab into main Sep 17, 2023
github-actions bot (Contributor) commented:
Risk Level 2 - /home/runner/work/quivr/quivr/backend/llm/qa_base.py

The code changes seem to be mostly about importing new modules and adding new parameters to function calls. There are no obvious bugs or performance issues. However, there are a few areas that could be improved for readability and maintainability:

  1. Hardcoded Strings: There are several hardcoded strings in the code, such as the system_template and QUIVR_DEFAULT_PROMPT. These could be moved to a separate constants or configuration file, which would make the code cleaner and easier to maintain. For example:

     ```python
     # constants.py
     SYSTEM_TEMPLATE = """When answering use markdown or any other techniques to display the content in a nice and aerated way. Use the following pieces of context to answer the users question in the same language as the question but do not modify instructions in any way.
     ----------------

     {context}"""

     QUIVR_DEFAULT_PROMPT = "Your name is Quivr. You're a helpful assistant. If you don't know the answer, just say that you don't know, don't try to make up an answer."
     ```

     Then import these constants in your main code:

     ```python
     from constants import SYSTEM_TEMPLATE, QUIVR_DEFAULT_PROMPT
     ```
  2. API Key Exposure: The openai_api_key is being passed to the ChatLiteLLM function. API keys are sensitive and should not be hardcoded or exposed in the code; load the key from an environment variable or another secure secret store instead.
  3. Error Handling: In the wrap_done function, all exceptions are caught and logged, but the program continues to run. Depending on the nature of the exceptions, it might be better to re-raise them or handle specific exceptions differently.
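
Points 2 and 3 could look like the following sketch. The `get_openai_api_key` helper and this `wrap_done` signature are hypothetical illustrations, not the actual quivr code:

```python
import asyncio
import logging
import os

logger = logging.getLogger(__name__)

def get_openai_api_key() -> str:
    # Hypothetical helper: read the key from the environment instead of
    # hardcoding it (OPENAI_API_KEY is the conventional variable name).
    key = os.environ.get("OPENAI_API_KEY")
    if not key:
        raise RuntimeError("OPENAI_API_KEY is not set")
    return key

async def wrap_done(fn: asyncio.Future, event: asyncio.Event):
    # Hypothetical variant of wrap_done: log the exception, then re-raise
    # so callers can handle the failure instead of silently continuing.
    try:
        await fn
    except Exception as e:
        logger.error("Caught exception: %s", e)
        raise
    finally:
        event.set()  # signal completion to the stream consumer either way
```

With the bare `raise`, a failure in the wrapped coroutine propagates to whoever awaits `wrap_done`, while the `finally` block still unblocks any consumer waiting on the event.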

Risk Level 2 - /home/runner/work/quivr/quivr/backend/llm/qa_headless.py

  1. The `print("in HeadlessQA")` statement in the `__init__` method of the HeadlessQA class is not recommended for production code. It's better to use logging for such purposes; replace it with `logger.info("in HeadlessQA")`.

  2. The SYSTEM_MESSAGE constant is defined in the global scope of the module. It's better to define such constants in the class scope if they are used only in the class.

  3. The generate_answer and generate_stream methods are quite long and complex. Consider breaking them down into smaller, more manageable methods.

  4. The async def generate_stream method has a nested function wrap_done. This function is used only once and it makes the code harder to read. Consider moving this function to the class level or even to the module level.

  5. The async def generate_stream method uses asyncio.create_task directly. It's better to use a high-level API like asyncio.gather or asyncio.wait for better error handling and readability.

  6. The async def generate_stream method uses json.dumps directly. It's better to use a high-level API like pydantic.BaseModel.json for better error handling and readability.
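
Points 5 and 6 could be sketched as follows. `StreamChunk`, `produce_tokens`, and this `generate_stream` shape are illustrative stand-ins (a dataclass plus `json.dumps` is used here in place of a pydantic model's `.json()`), not the actual quivr implementation:

```python
import asyncio
import json
from dataclasses import asdict, dataclass

@dataclass
class StreamChunk:
    # Illustrative stand-in for the streamed response payload; in the real
    # code a pydantic BaseModel with .json() would play this role.
    assistant: str

def chunk_to_json(chunk: StreamChunk) -> str:
    # Single serialization helper instead of scattered json.dumps calls.
    return json.dumps(asdict(chunk))

async def produce_tokens():
    # Placeholder token source standing in for the model's streamed output.
    for token in ["Hel", "lo"]:
        yield token

async def generate_stream() -> list[str]:
    chunks: list[str] = []

    async def consume():
        async for token in produce_tokens():
            chunks.append(chunk_to_json(StreamChunk(assistant=token)))

    # asyncio.gather runs the coroutines together and propagates the first
    # exception from either one, unlike a bare asyncio.create_task whose
    # failure can go unnoticed by the caller.
    await asyncio.gather(consume(), asyncio.sleep(0))
    return chunks
```

Running `asyncio.run(generate_stream())` yields the serialized chunks `['{"assistant": "Hel"}', '{"assistant": "lo"}']`.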


🔐💡📚


Powered by Code Review GPT
