The Chat API allows omitting the max_tokens parameter entirely, and other LLM wrappers in LangChain support this by accepting -1 as the value. Could you extend that support to the ChatOpenAI model? Something like the image seems to work?
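A minimal sketch of the idea, not LangChain's actual implementation: when assembling the request payload, treat -1 (or None) as "unset" and drop the max_tokens key altogether, so the Chat API falls back to its own default. The helper name and structure below are hypothetical.

```python
from typing import Optional


def build_chat_params(model: str, max_tokens: Optional[int] = -1, **kwargs) -> dict:
    """Build keyword arguments for a Chat Completions request.

    Hypothetical helper: max_tokens of -1 (or None) means "let the API
    decide", so the key is omitted from the payload entirely instead of
    being forwarded as an invalid value.
    """
    params = {"model": model, **kwargs}
    if max_tokens is not None and max_tokens != -1:
        params["max_tokens"] = max_tokens
    return params
```

With this shape, ChatOpenAI could keep -1 as its sentinel default while still sending a well-formed request, and users who do want a cap simply pass a positive integer.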