Describe the bug
After using it for a few tens of minutes, requests start failing with: "This model's maximum context length is 65536 tokens. However, you requested 69135 tokens (61135 in the messages, 8000 in the completion). Please reduce the length of the messages or completion."
I am on the newest version of bolt.diy. Over time the growing chat history overloads the context window.
Link to the Bolt URL that caused the error
Steps to reproduce
- Try to develop an app with DeepSeek
- Develop parts of the app over several prompts
- After roughly 20 to 30 iterations, the context length error appears
Expected behavior
The possibility to choose how many tokens of conversation history are sent with each request, since over time the accumulated context exceeds the model's limit.
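As a rough illustration of what such an option could do, here is a minimal TypeScript sketch. This is not bolt.diy's actual code; the message shape, the chars/4 token estimate, and the function names are all assumptions. It simply drops the oldest non-system messages until the prompt fits within the context window minus the reserved completion budget.

```ts
// Minimal sketch (hypothetical, not bolt.diy's implementation): trim older chat
// messages so prompt tokens + reserved completion tokens stay under the model's
// context window (65536 and 8000 taken from the error message above).

interface ChatMessage {
  role: 'system' | 'user' | 'assistant';
  content: string;
}

// Rough token estimate; a real implementation would use the model's tokenizer.
const estimateTokens = (text: string): number => Math.ceil(text.length / 4);

function trimToContextWindow(
  messages: ChatMessage[],
  contextWindow = 65536,   // model's maximum context length
  completionBudget = 8000, // tokens reserved for the model's reply
): ChatMessage[] {
  const promptBudget = contextWindow - completionBudget;

  // Always keep the system prompt (assumed to be the first message).
  const [system, ...rest] = messages;
  let used = estimateTokens(system.content);

  // Walk backwards so the most recent messages are kept first.
  const kept: ChatMessage[] = [];
  for (let i = rest.length - 1; i >= 0; i--) {
    const cost = estimateTokens(rest[i].content);
    if (used + cost > promptBudget) break; // drop everything older than this
    kept.unshift(rest[i]);
    used += cost;
  }

  return [system, ...kept];
}

// Usage: send the trimmed history instead of the full one.
// const payload = { model: 'deepseek-chat', messages: trimToContextWindow(history), max_tokens: 8000 };
```

A user-facing setting could expose the prompt budget (or a "max history tokens" slider) so this limit is configurable per provider.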
Screen Recording / Screenshot
No response
Platform
- OS: [e.g. macOS, Windows, Linux]
- Browser: [e.g. Chrome, Safari, Firefox]
- Version: [e.g. 91.1]
Provider Used
No response
Model Used
No response
Additional context
No response