I set context_length with the following entry in my config file:
The context_length seems to be read correctly from the config, since it suppresses the sharding context_length error. However, I am still getting context length errors back from OpenAI, saying the corresponding text exceeds the token limit.
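For reference, here is a minimal, self-contained check I would use to confirm whether a given chunk of text really exceeds the model's token limit, independent of the config. It assumes the `tiktoken` package and the `text-embedding-ada-002` model (8191-token limit); swap in whichever model your setup actually targets:

```python
# Sketch: count tokens for a piece of text with the model's own tokenizer,
# to see whether it genuinely exceeds the model's documented context length.
import tiktoken

MODEL = "text-embedding-ada-002"  # assumed model; replace with the one in use
TOKEN_LIMIT = 8191                # documented limit for text-embedding-ada-002

def count_tokens(text: str, model: str = MODEL) -> int:
    """Return the number of tokens the model's tokenizer produces for `text`."""
    encoding = tiktoken.encoding_for_model(model)
    return len(encoding.encode(text))

text = "..."  # the chunk that triggers the OpenAI error
n = count_tokens(text)
print(f"{n} tokens (limit {TOKEN_LIMIT}) -> "
      f"{'exceeds' if n > TOKEN_LIMIT else 'fits within'} the limit")
```

If the count reported here is below the limit but OpenAI still rejects the request, the text being sent is probably not the text being measured (e.g. chunks are concatenated or re-expanded before the API call).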