Problem with system prompt when using Mode.JSON #1514

@dimentary

Description

Is your feature request related to a problem? Please describe.
I am using an LLM for roleplay story writing, and I use Mode.JSON for models that don't support tool calling. When this mode is used, Instructor inserts the following text into the system prompt:

```
As a genius expert, your task is to understand the content and provide
the parsed objects in json that match the following json_schema:

{json.dumps(response_model.model_json_schema(), indent=2, ensure_ascii=False)}

Make sure to return an instance of the JSON, not the schema itself
```

The problem is the "As a genius expert" part. I understand that it may improve the quality of the LLM's output, but in my case it breaks character consistency, since the model now acts as a "genius expert" instead of staying in character.
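For context, the injected message can be reproduced locally without calling any model. The sketch below renders the template quoted above using only the standard library; the `schema` dict is a hypothetical stand-in for what `response_model.model_json_schema()` would return for a small Pydantic model.

```python
import json

# Hypothetical schema, standing in for response_model.model_json_schema()
schema = {
    "title": "Character",
    "type": "object",
    "properties": {"name": {"type": "string"}},
    "required": ["name"],
}

# The Mode.JSON system-prompt template as quoted in this issue
message = (
    "As a genius expert, your task is to understand the content and provide\n"
    "the parsed objects in json that match the following json_schema:\n\n"
    f"{json.dumps(schema, indent=2, ensure_ascii=False)}\n\n"
    "Make sure to return an instance of the JSON, not the schema itself"
)

print(message)
```

Because the "As a genius expert" persona is baked into the template string itself, every request in this mode opens with it, which is what clashes with a roleplay system prompt.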

Describe the solution you'd like
A more generic system prompt for the JSON mode.

Describe alternatives you've considered
The ability for the user to pass a custom system prompt.

Metadata

    Labels

    bug (Something isn't working), enhancement (New feature or request)
