[Bug]: PentAGI doesn't actually work well with custom models hosted locally such as Azure OpenAI models. #51

@Aishani20278

Description

Affected Component

AI Agents (Researcher/Developer/...)

Describe the bug

Filling out the custom LLMs section with Azure OpenAI model deployment details, together with a custom config file, does not actually configure PentAGI to run against Azure OpenAI. Not even a basic flow is created.

Steps to Reproduce

  1. Go to the .env file and fill in the following settings in the custom LLMs section:
    LLM_SERVER_URL=
    LLM_SERVER_KEY=
    LLM_SERVER_MODEL=
    LLM_SERVER_CONFIG_PATH=
    LLM_SERVER_LEGACY_REASONING=false

  2. Run PentAGI with these changes applied.
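For illustration, a hypothetical filled-in version of these settings for an Azure OpenAI deployment might look like the sketch below. The resource name, deployment name, API key, and config path are placeholders, and whether PentAGI accepts an Azure-style deployment URL in `LLM_SERVER_URL` is exactly what this issue is about:

```shell
# Hypothetical values; <resource>, <deployment>, and the key/path are placeholders.
# Azure OpenAI exposes an OpenAI-style endpoint scoped to a deployment; Azure
# additionally expects an api-version query parameter on requests, which a
# generic OpenAI-compatible client may not append.
LLM_SERVER_URL=https://<resource>.openai.azure.com/openai/deployments/<deployment>
LLM_SERVER_KEY=<azure-api-key>
LLM_SERVER_MODEL=<deployment>
LLM_SERVER_CONFIG_PATH=/path/to/custom-llm-config.yml
LLM_SERVER_LEGACY_REASONING=false
```

Note that Azure authenticates with an `api-key` header rather than the `Authorization: Bearer` header used by the standard OpenAI API, which is another plausible source of this failure.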

System Configuration

Deployment type: Custom Deployment
LLM Provider: Azure OpenAI

Logs and Artifacts

No response

Screenshots or Recordings

No response

Verification

  • I have checked that this issue hasn't been already reported
  • I have provided all relevant configuration files (with sensitive data removed)
  • I have included relevant logs and error messages
  • I am running the latest version of PentAGI

Metadata

Labels

enhancement (New feature or request)
