
[BUG] Assistant prompt causes LLM to produce fake content #4181

@J-lena

Description

We do not want the model to generate an Observation during tool calls, because when we use flow streaming, this unwanted content reaches the client. We have already customized the tools prompt, but the assistant prompt is added automatically, which causes subsequent tool calls to fabricate the tool's output.

Steps to Reproduce

Use tools with an agent.

Expected behavior

When using tools, the Observation should not be generated by the LLM.

Screenshots/Code snippets

(screenshot attached)

Operating System

Ubuntu 20.04

Python Version

3.10

crewAI Version

latest

crewAI Tools Version

xx

Virtual Environment

Venv

Evidence

xx

Possible Solution

xx
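One possible client-side mitigation, until the injected assistant prompt can be disabled: cut the stream off as soon as the model starts fabricating an Observation. The sketch below is plain Python, not crewAI API; the function name `truncate_fake_observation` and the `"\nObservation:"` marker are assumptions based on the ReAct-style prompt format described above, and the marker should be adjusted to match the customized tools prompt.

```python
# Sketch of a client-side guard: stop streaming once the model starts
# fabricating an "Observation:" section. The marker string is assumed to
# match the ReAct-style prompt format; adjust it to your tools prompt.

def truncate_fake_observation(chunks, marker="\nObservation:"):
    """Yield streamed text chunks, stopping at the first fabricated Observation.

    `chunks` is any iterable of text fragments from the LLM stream.
    A rolling buffer handles the case where the marker is split
    across chunk boundaries.
    """
    buffer = ""
    for chunk in chunks:
        buffer += chunk
        idx = buffer.find(marker)
        if idx != -1:
            # Emit only the text before the fabricated Observation.
            yield buffer[:idx]
            return
        # Keep a tail long enough to hold a partially streamed marker;
        # everything before that tail is safe to emit now.
        safe = len(buffer) - len(marker) + 1
        if safe > 0:
            yield buffer[:safe]
            buffer = buffer[safe:]
    if buffer:
        yield buffer


# Example: the model starts inventing a tool result mid-stream.
stream = ["Thought: call the se", "arch tool\nObserv", "ation: fake result"]
print("".join(truncate_fake_observation(stream)))
# → Thought: call the search tool
```

This only hides the fabricated text from the client; the real fix is stopping the model from generating it, e.g. by passing the marker as a stop sequence to the underlying LLM call if the configuration allows it.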

Additional context

xx

Metadata

Assignees

No one assigned

Labels

bug (Something isn't working)
