This repository was archived by the owner on Jun 5, 2025. It is now read-only.

Open Interpreter always reports on 'filename' #1037

@lukehinds

Description


Describe the issue

There is a package on PyPI called filename that has been reported as malicious.

Nearly every snippet run by Open Interpreter includes the word filename in its system prompt, which triggers an alert.

python main.py
Loading mistral-nemo...

Model loaded.

  Warning: CodeGate detected one or more malicious packages.

   • filename: https://www.insight.stacklok.com/report/pypi/filename?utm_source=codegate

  critical vulnerability found, you must take action
Assistant's response: [{'role': 'assistant', 'type': 'message', 'content': '**Warning:** CodeGate detected one or more malicious packages.\n- filename: [https://www.insight.stacklok.com/report/pypi/filename?utm_source=codegate](https://www.insight.stacklok.com/report/pypi/filename?utm_source=codegate)\n\ncritical vulnerability found, you must take action'}]
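The false positive described above can be illustrated with a small sketch. This is hypothetical logic, not CodeGate's actual implementation: a naive substring scan over prompt text will flag any occurrence of the word "filename", while a scan restricted to contexts where a package is actually referenced (e.g. pip install commands or import statements) would not.

```python
import re

# Hypothetical example list; "filename" stands in for the flagged PyPI package.
MALICIOUS_PACKAGES = {"filename"}

def naive_scan(text: str) -> set:
    """Flag any malicious package name appearing anywhere in the text."""
    return {pkg for pkg in MALICIOUS_PACKAGES if pkg in text}

# Context-aware alternative: only consider names that appear where a
# package would actually be referenced.
INSTALL_RE = re.compile(r"(?:pip\s+install|import)\s+([A-Za-z0-9_\-]+)")

def context_scan(text: str) -> set:
    """Flag malicious names only when they appear as installed/imported packages."""
    return set(INSTALL_RE.findall(text)) & MALICIOUS_PACKAGES

prompt = "Write the result to a file; use the filename the user provided."
assert naive_scan(prompt) == {"filename"}  # false positive on ordinary prose
assert context_scan(prompt) == set()       # no package reference, no alert

code = "import filename"
assert context_scan(code) == {"filename"}  # genuine package reference still caught
```

This suggests the alert could be limited to actual package references rather than any occurrence of the name in prompt text.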

Steps to Reproduce

python3 -m venv venv && source venv/bin/activate

pip install open-interpreter

from interpreter import interpreter

interpreter.offline = True
interpreter.llm.model = "ollama/mistral-nemo:latest"
interpreter.llm.api_base = "http://localhost:8989/ollama"

response = interpreter.chat("I don't like the color red, can you change it to blue?")
print("Assistant's response:", response) 

Operating System

Microsoft Windows (Intel)

IDE and Version

N/A

Extension and Version

N/A

Provider

Ollama

Model

N/A

Codegate version

main

Logs

No response

Additional Context

No response
