Labels: bug (Something isn't working)
Description
When a session exceeds the model's context limit, the GitLab provider throws TOKEN_REFRESH_NEEDED instead of reporting the actual error.
Actual error (from logs):
400 {"type":"invalid_request_error","message":"prompt is too long: 218390 tokens > 200000 maximum"}
Displayed error:
GitLabError: TOKEN_REFRESH_NEEDED
Root cause: The error handling in @gitlab/gitlab-ai-provider appears to catch non-401 errors and incorrectly classify them as token issues.
Impact:
- Users spend significant time debugging authentication issues when the real problem is context overflow
- This also affects #9949 (fix(provider): add OAuth token refresh support for GitLab provider): if that PR lands, the provider would attempt a token refresh on context overflow errors, and the refresh would fail without ever surfacing the real problem
Expected behavior:
- 400 "prompt too long" errors should be reported as context overflow
- TOKEN_REFRESH_NEEDED should only be thrown for actual 401 authentication errors
- Ideally, suggest starting a new session or compacting context
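The expected behavior above could be implemented with a status-aware classifier. This is only a hypothetical sketch: the actual internals of @gitlab/gitlab-ai-provider are not shown in this report, and the names `classifyError`, `ApiError`, and the returned error codes other than `TOKEN_REFRESH_NEEDED` are invented for illustration.

```typescript
// Hypothetical error shape; assumes the provider can see the HTTP status
// and the parsed error body from the upstream API.
interface ApiError {
  status: number;
  type?: string;
  message?: string;
}

function classifyError(err: ApiError): string {
  // Only genuine 401 authentication failures should trigger a token refresh.
  if (err.status === 401) {
    return "TOKEN_REFRESH_NEEDED";
  }
  // A 400 "prompt is too long" response is a context overflow, not an
  // auth problem; report it as such so the user can compact or restart.
  if (err.status === 400 && /prompt is too long/i.test(err.message ?? "")) {
    return "CONTEXT_OVERFLOW";
  }
  // Everything else should pass through with its original message.
  return "UNKNOWN_PROVIDER_ERROR";
}
```

With the error from the logs above, `classifyError({ status: 400, message: "prompt is too long: 218390 tokens > 200000 maximum" })` would yield a context overflow instead of a spurious token-refresh error.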
Environment:
- opencode version: 1.1.51
- Provider: gitlab/duo-chat-opus-4-5
Plugins
@gitlab/opencode-gitlab-plugin
OpenCode version
1.1.51
Steps to reproduce
- Configure opencode with GitLab Duo as the LLM provider
- Start a session and have a long conversation (or use tools that return large outputs) until context approaches 200k tokens
- Send another message
- Observe GitLabError: TOKEN_REFRESH_NEEDED error instead of a context overflow message
Operating System
Ubuntu 24.04.3 LTS (Noble Numbat)
Terminal
xterm-256color (ZSH)