
[Bug]: Vision Model Token Limit Incorrectly Defaults to GPT-4's 8187 #2496

Closed
@ssuzuki-github

Description

What happened?

The token limit incorrectly defaults to GPT-4's 8187 when the Vision model is selected. When starting a session with 'gpt-4-vision-preview', I expected the system to apply that model's own token limit, which should be considerably higher than 8187.
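
For context, this behaviour is consistent with a model-to-limit lookup that silently falls back to the GPT-4 value when a model name is missing from its table. The sketch below only illustrates that failure mode under assumed names (`MODEL_TOKEN_LIMITS`, `getTokenLimit`, and `DEFAULT_TOKEN_LIMIT` are hypothetical, not this project's actual identifiers):

```ts
// Hypothetical sketch, not this repository's code: a lookup table that
// falls back to the GPT-4 limit when a model is not registered.
const MODEL_TOKEN_LIMITS: Record<string, number> = {
  "gpt-3.5-turbo": 4096,
  "gpt-4": 8187,      // base GPT-4 limit currently used as the fallback
  "gpt-4-32k": 32768,
  // "gpt-4-vision-preview" is missing, so the lookup below returns 8187
};

const DEFAULT_TOKEN_LIMIT = 8187;

function getTokenLimit(model: string): number {
  // Unknown models (including the Vision model) fall back to the GPT-4 value,
  // which is why long prompts are rejected even though gpt-4-vision-preview
  // supports a much larger (128k) context window.
  return MODEL_TOKEN_LIMITS[model] ?? DEFAULT_TOKEN_LIMIT;
}

// A likely fix would be to register the Vision model explicitly:
MODEL_TOKEN_LIMITS["gpt-4-vision-preview"] = 128000;

console.log(getTokenLimit("gpt-4-vision-preview")); // 128000 after the fix; 8187 before
```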

Steps to Reproduce

  1. Select 'gpt-4-vision-preview' and start a chat session.
  2. Enter a long prompt that should exceed the 8187 token limit.
  3. Receive an error message stating that the token limit has been exceeded, contrary to expectations for the selected Vision model.

What browsers are you seeing the problem on?

Microsoft Edge

Relevant log output

No response

Screenshots

(screenshot attached in the original issue)

Code of Conduct

  • I agree to follow this project's Code of Conduct

Labels

bug (Something isn't working)
