This repository was archived by the owner on Jan 2, 2025. It is now read-only.

Conversation

@ggordonhall
Contributor

Previously, we inserted every discovered code chunk into the explanation prompt. This change instead iteratively adds recent chunks until the token count exceeds `max_tokens - headroom`, where `headroom` is set to 1500 tokens to leave room for the prompt text itself.
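A minimal sketch of the trimming strategy described above, in Python. The function names and the whitespace-based token counter are assumptions for illustration only; the actual implementation presumably counts tokens with the model's tokenizer and operates on its own chunk type.

```python
HEADROOM = 1500  # tokens reserved for the prompt text itself, per the PR description


def count_tokens(text: str) -> int:
    # Hypothetical stand-in for a real tokenizer: approximate tokens by
    # whitespace-separated words. The real code would use the model tokenizer.
    return len(text.split())


def trim_chunks(chunks: list[str], max_tokens: int, headroom: int = HEADROOM) -> list[str]:
    """Greedily keep the most recent chunks that fit within the token budget.

    Assumes `chunks` is ordered oldest -> newest; stops adding once the next
    chunk would push the total past `max_tokens - headroom`.
    """
    budget = max_tokens - headroom
    selected: list[str] = []
    total = 0
    for chunk in reversed(chunks):  # walk from most recent backwards
        cost = count_tokens(chunk)
        if total + cost > budget:
            break
        selected.append(chunk)
        total += cost
    return list(reversed(selected))  # restore chronological order
```

For example, with a budget of 3 tokens after headroom, only the most recent chunk of `["a b", "c d e"]` survives; widening the budget keeps both, in their original order.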

@ggordonhall ggordonhall merged commit 76072b9 into main May 26, 2023
@ggordonhall ggordonhall deleted the gabriel/blo-1025-trim-code-context branch May 26, 2023 13:42


3 participants