[Bug]: Vision Model Token Limit Incorrectly Defaults to GPT-4's 8187 #2496
Closed
Description
What happened?
When starting a session with 'gpt-4-vision-preview', the token limit incorrectly defaults to GPT-4's 8187. I expected the system to apply the Vision model's own token limit, which should be higher than 8187.
Steps to Reproduce
- Select 'gpt-4-vision-preview' and start a chat session.
- Enter a long prompt that exceeds the 8187 token limit.
- Receive an error message stating that the token limit has been exceeded, contrary to expectations for the selected Vision model.
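The likely fix is to look up the limit per model instead of applying GPT-4's limit across the board. A minimal sketch, assuming hypothetical names (`MODEL_TOKEN_LIMITS`, `getTokenLimit`) that are not from this project's codebase, and using OpenAI's published context sizes:

```typescript
// Hypothetical per-model token-limit lookup with an explicit fallback.
// Model names and limits follow OpenAI's documentation; the map and
// function names are illustrative, not this project's actual API.
const MODEL_TOKEN_LIMITS: Record<string, number> = {
  "gpt-4": 8192,
  "gpt-4-vision-preview": 128000,
};

const DEFAULT_TOKEN_LIMIT = 8192;

function getTokenLimit(model: string): number {
  // Fall back to the default only for genuinely unknown models,
  // rather than applying GPT-4's limit to every model.
  return MODEL_TOKEN_LIMITS[model] ?? DEFAULT_TOKEN_LIMIT;
}
```

With a lookup like this, selecting 'gpt-4-vision-preview' would no longer inherit the 8187 limit reported above.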
What browsers are you seeing the problem on?
Microsoft Edge
Relevant log output
No response
Code of Conduct
- I agree to follow this project's Code of Conduct