
bug: Prometheus reports LLM metrics even when LLM plugin is not enabled #12953

@qiqizjl

Description


Current Behavior

When the LLM plugin is not enabled, the Prometheus exporter still reports LLM metrics (apisix_llm_latency, apisix_llm_prompt_tokens, apisix_llm_completion_tokens) with value 0.

This happens because:

  1. The nginx variables (llm_time_to_first_token, llm_prompt_tokens, llm_completion_tokens) default to "0" (set in apisix/cli/ngx_tpl.lua)
  2. The condition check in apisix/plugins/prometheus/exporter.lua (L376-L394) only verifies that the value is non-empty (~= ""), so the default "0" passes the check

As a result, all requests (even non-LLM requests) generate LLM metrics with 0 values, polluting the metrics data.
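The pattern described above can be sketched roughly like this (a simplified illustration, not the exact exporter.lua code; the metric object and label helper names here are placeholders):

```lua
-- Sketch of the current guard: it only tests for a non-empty string,
-- so the default "0" from ngx_tpl.lua passes it on every request.
local ttft = ctx.var.llm_time_to_first_token
if ttft and ttft ~= "" then
    -- "0" ~= "" is true, so this runs even for non-LLM requests
    metrics.llm_latency:observe(tonumber(ttft), labels)
end
```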

Expected Behavior

LLM metrics should only be reported when the LLM plugin is actually used and processing requests.

Discussion on Potential Solutions

Solution 1: Change default value from "0" to ""

Pros: Simple change
Cons: May affect users' existing JSON log format (as mentioned in #12841)
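Assuming the defaults are declared as set directives in the nginx template, the change would look something like this (illustrative fragment, not the literal ngx_tpl.lua content):

```
# apisix/cli/ngx_tpl.lua (conceptually): empty string instead of "0"
set $llm_time_to_first_token "";
set $llm_prompt_tokens "";
set $llm_completion_tokens "";
```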

Solution 2: Check ~= "0" instead of ~= ""

Pros: No impact on log format
Cons: If an LLM request fails or returns 0 tokens (e.g., error response), the request won't be counted in metrics. This loses visibility into failed/abnormal LLM requests.
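A sketch of this variant (illustrative; metric and label names are placeholders):

```lua
-- Sketch of Solution 2: additionally reject the default "0".
local v = ctx.var.llm_prompt_tokens
if v and v ~= "" and v ~= "0" then
    -- Only reported when a non-zero value was written; note that a
    -- genuine zero-token LLM response is now silently skipped too.
    metrics.llm_prompt_tokens:observe(tonumber(v), labels)
end
```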

Solution 3: Add a dedicated flag variable (e.g., llm_request_active)

Set this flag to "1" only when LLM plugin processes the request, then check this flag before reporting metrics.
Pros: Accurate detection, no side effects
Cons: Requires adding a new nginx variable
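Roughly, this would mean two small changes (llm_request_active and report_llm_metrics are hypothetical names used only for illustration):

```lua
-- In the LLM plugin, once it actually handles the request:
ctx.var.llm_request_active = "1"

-- In the Prometheus exporter, before reporting any LLM metric:
if ctx.var.llm_request_active == "1" then
    report_llm_metrics(ctx)  -- hypothetical helper wrapping the observes
end
```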

Solution 4: Check request_type variable

Only report LLM metrics when request_type is "ai_stream" or "ai_chat".
Pros: Uses existing variable
Cons: Need to verify that request_type is reliably set for every LLM request
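A sketch of this check (report_llm_metrics is a hypothetical helper, not existing code):

```lua
-- Sketch of Solution 4: gate on the existing request_type variable.
local rt = ctx.var.request_type
if rt == "ai_stream" or rt == "ai_chat" then
    report_llm_metrics(ctx)
end
```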

I would appreciate guidance on the preferred approach before submitting a PR.

Related PR: #12841

Steps to Reproduce

  1. Start APISIX with Prometheus plugin enabled (without any LLM plugin configured)
  2. Send a normal HTTP request to any route
  3. Check Prometheus metrics endpoint
  4. Observe that LLM metrics are reported with value 0

Environment

  • APISIX version: master branch
  • Related code:
    • apisix/cli/ngx_tpl.lua (default variable values)
    • apisix/plugins/prometheus/exporter.lua (L376-L394)

Metadata


Assignees

No one assigned

    Labels

    bug (Something isn't working), plugin

    Type

    No type

    Projects

    Status

    📋 Backlog

    Milestone

    No milestone

    Relationships

    None yet

    Development

    No branches or pull requests
