feat: default max output tokens for autocomplete #5789


Open
wants to merge 1 commit into main

Conversation

@uinstinct (Contributor) commented May 22, 2025

Description

Adds a lower default maximum output token limit for autocomplete, so completion requests use fewer tokens by default.

Closes #5449

References #4448 (comment)
References #3994 (comment)

Checklist

  • [ ] I've read the contributing guide
  • [ ] The relevant docs, if any, have been updated or created
  • [ ] The relevant tests, if any, have been updated or created

Screenshots

[ For visual changes, include screenshots. Screen recordings are particularly helpful, and appreciated! ]

Tests

[ What tests were added or updated to ensure the changes work as expected? ]


Summary by cubic

Sets a lower default max output token limit (256) for autocomplete to reduce unnecessary token usage.

  • New Features
    • Autocomplete now uses a smaller max token limit by default unless the user overrides it (see the sketch below).
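
To make the override semantics concrete, here is a minimal TypeScript sketch of the intended behavior. The constant and function names are illustrative only, not the PR's actual code:

```typescript
// Illustrative only: a user-configured maxTokens always wins; otherwise
// autocomplete falls back to the conservative 256-token default.
const AUTOCOMPLETE_DEFAULT_MAX_TOKENS = 256;

function effectiveMaxTokens(userMaxTokens?: number): number {
  return userMaxTokens ?? AUTOCOMPLETE_DEFAULT_MAX_TOKENS;
}

console.log(effectiveMaxTokens());     // 256 -> default applies
console.log(effectiveMaxTokens(1024)); // 1024 -> explicit override wins
```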

@uinstinct uinstinct requested a review from a team as a code owner May 22, 2025 06:45
@uinstinct uinstinct requested review from Patrick-Erichsen and removed request for a team May 22, 2025 06:45

netlify bot commented May 22, 2025

Deploy Preview for continuedev canceled.

| Name | Link |
| ---- | ---- |
| 🔨 Latest commit | 3ebd8f9 |
| 🔍 Latest deploy log | https://app.netlify.com/projects/continuedev/deploys/682ec80bbfd12600096dfb21 |

@dosubot dosubot bot added the size:S This PR changes 10-29 lines, ignoring generated files. label May 22, 2025
@@ -73,6 +74,11 @@ export class CompletionProvider {
llm.completionOptions.temperature = 0.01;
}

// (DOES NOT WORK) llm.completionOptions.maxTokens is already populated - need to detect whether the llm already has maxTokens set
@uinstinct (Contributor, Author) commented:
I need some help here: I want to know whether maxTokens was already set in the user's config. However, BaseLLM populates the completion options' maxTokens, so by this point an explicit user value and the BaseLLM default look the same.
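
One possible direction, sketched below under the assumption that the raw (pre-default) completion options can be preserved. `BaseLLMSketch`, `userCompletionOptions`, and `applyAutocompleteDefault` are hypothetical names for illustration, not Continue's actual API:

```typescript
interface CompletionOptions {
  maxTokens?: number;
  temperature?: number;
}

// Hypothetical stand-in for BaseLLM: it snapshots the user's options
// before merging in defaults, so "user set maxTokens" stays detectable.
class BaseLLMSketch {
  readonly userCompletionOptions: CompletionOptions;
  completionOptions: CompletionOptions;

  constructor(userOptions: CompletionOptions = {}) {
    this.userCompletionOptions = { ...userOptions };
    // After this merge, maxTokens is always populated, which is exactly
    // why checking completionOptions alone cannot answer the question.
    this.completionOptions = { maxTokens: 4096, ...userOptions };
  }
}

function applyAutocompleteDefault(llm: BaseLLMSketch): void {
  // Lower the cap only when the user never set maxTokens themselves.
  if (llm.userCompletionOptions.maxTokens === undefined) {
    llm.completionOptions.maxTokens = 256;
  }
}
```

Another option would be to resolve the autocomplete default earlier, in the config loading path, before BaseLLM's defaulting runs at all.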

Labels
size:S This PR changes 10-29 lines, ignoring generated files.
Projects
Status: Todo
Development

Successfully merging this pull request may close these issues.

maxTokens for models used for autocomplete should default to a low setting
1 participant