Get model availability from endpoint instead of computing it manually #2386
Conversation
// If the dropdown is already disabled, don't change it. It can happen
// in the case of the legacy commands (updateAfterFirstMessage happens before this call).
isEnabled = isEnabled && chatModelFromState == null

isVisible = !isEnterpriseAccount || hasServerSentModels
@jamesmcnamara After the cody commit bump this seems to not be working anymore.
Could you remind me what the rationale was for introducing hasServerSentModels?
I think either way we should decide that on the Cody level, and if some conditions are not met we should simply not send models through this endpoint. This way we won't have custom, error-prone logic on the client side.
Also, it looks like Cody now allows choosing between enterprise models, so I'm changing this too.
I think these could be some leftovers related to chats (which are redundant now, as we use the webview UI). You can double-check that the Edit Code dialog works properly (this is the only place where we use the dropdown at present).
I did: see the test plan :)
Yes, the only need for this flag was to ensure that enterprise users saw a dropdown menu for LLM options on the chat view if their enterprise instance supported it. If we are using the webview now, then it won't matter, because the webview handles it itself.
## Changes

This change will allow clients to know if a model is usable for them without manually fetching the account tier and computing model availability. This makes client code simpler, ensures the same conditions are used by both Cody and clients, and is less error-prone.

## Test plan

This is only used in JetBrains currently. Test plan on the JetBrains side: sourcegraph/jetbrains#2386
Changes
Follow-up to https://github.com/sourcegraph/cody/pull/5804
Instead of manually fetching the account tier and computing model availability, we now receive it from the agent.
This makes the code simpler, ensures the same conditions are used by both Cody and JetBrains, and is less error-prone.
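The shift described above can be sketched as follows. This is a minimal illustration, not the actual agent protocol: the `ChatModel` type and the `isCodyProOnly` and `usableByCurrentUser` field names are hypothetical, chosen only to contrast the old client-side computation with the new server-provided flag.

```kotlin
// Hypothetical sketch of the change. Type and field names are illustrative,
// not the real Cody agent protocol types.
data class ChatModel(
    val id: String,
    val isCodyProOnly: Boolean,      // previously needed for client-side computation
    val usableByCurrentUser: Boolean // now computed by the agent/endpoint
)

// Before: the client fetched the account tier and computed availability itself.
fun isModelUsableOld(model: ChatModel, isProAccount: Boolean): Boolean =
    !model.isCodyProOnly || isProAccount

// After: the client simply trusts the flag the agent already sent,
// so the same condition is evaluated in exactly one place.
fun isModelUsableNew(model: ChatModel): Boolean =
    model.usableByCurrentUser

fun main() {
    val proOnly = ChatModel("some-pro-model", isCodyProOnly = true, usableByCurrentUser = false)
    // Both paths agree for a free account; the new one needs no tier lookup.
    println(isModelUsableOld(proOnly, isProAccount = false)) // false
    println(isModelUsableNew(proOnly))                       // false
}
```

The point of the second form is that a client cannot drift out of sync with Cody's rules, which is exactly the bug class this PR removes.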
Test plan

Free user:
- Pick a model without the Pro label and do the edit -> it should succeed
- Pick a model with the Pro label and do the edit -> it should open the account upgrade website

Pro user:
- Should not see Pro labels anywhere

Enterprise user:
- Should not see Pro labels anywhere

Attention: This is different than the current behaviour, but I checked Cody and it allows model selection for enterprise accounts, so I think the JetBrains behaviour drifted there.