
refactor: Refactor endpoint and model compatibility check #180

Merged: 10 commits from j178:model-check into sashabaranov:master on Mar 22, 2023

Conversation

@j178 (Contributor) commented Mar 20, 2023

  • Add a model check for CreateCompletionStream and CreateChatCompletionStream (a rough sketch of where such a check sits follows below).
  • Correct the model check in CreateCompletion.
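
For illustration, a minimal sketch of where such a check could sit in a stream constructor. The helper, error value, and function signature below are simplified stand-ins, not the exact code in this PR:

package main

import (
	"errors"
	"fmt"
)

// Illustrative sentinel error; the message mirrors the one seen later in this thread.
var ErrCompletionUnsupportedModel = errors.New("this model is not supported with this method, please use CreateChatCompletion client method instead")

// checkEndpointSupportsModel is a stand-in for the real compatibility check.
func checkEndpointSupportsModel(endpoint, model string) bool {
	return !(endpoint == "/completions" && model == "gpt-3.5-turbo")
}

// createCompletionStream guards stream creation with the model check before
// any request is sent.
func createCompletionStream(model string) error {
	if !checkEndpointSupportsModel("/completions", model) {
		return ErrCompletionUnsupportedModel
	}
	// ... set up the streaming request and return the stream here ...
	return nil
}

func main() {
	fmt.Println(createCompletionStream("gpt-3.5-turbo")) // rejected locally, no API call made
}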

codecov bot commented Mar 20, 2023

Codecov Report

Merging #180 (d33e5c3) into master (4288394) will increase coverage by 0.54%.
The diff coverage is 100.00%.

@@            Coverage Diff             @@
##           master     #180      +/-   ##
==========================================
+ Coverage   71.30%   71.84%   +0.54%     
==========================================
  Files          21       21              
  Lines         568      579      +11     
==========================================
+ Hits          405      416      +11     
  Misses        124      124              
  Partials       39       39              
Impacted Files    Coverage Δ
chat.go           100.00% <100.00%> (ø)
chat_stream.go    88.00% <100.00%> (+3.00%) ⬆️
completion.go     100.00% <100.00%> (ø)
stream.go         76.00% <100.00%> (+6.00%) ⬆️


@sashabaranov (Owner) commented:

Hey, thank you for the PR!

(Inline review comments were left on stream.go, completion.go, and chat_stream.go and have been resolved.)
@j178 (Contributor, Author) commented Mar 21, 2023

Hi, I am wondering whether it is really necessary to have compatibility checks between the endpoint and the model, or whether they could be removed so that an incompatible request simply fails. Here are some points to consider:

  1. OpenAI regularly releases new models, so we would have to keep these checks up to date; otherwise they would prevent users from using new models.
  2. If an incompatible model is used with an endpoint, the OpenAI API returns a clear error message, which should be enough to inform users of the mistake.

@sashabaranov (Owner) commented:

@j178 What if we check for "incompatibility" instead, that is, return an error only when we are sure that the model is not supported?

@j178 (Contributor, Author) commented Mar 22, 2023

basically return error only when we are sure that the model is not supported?

While technically possible, that approach would be verbose and difficult to maintain: we would need to explicitly list every unsupported model for each endpoint, like this:

// checkEndpointSupportsModel returns an error when the model is known to be
// incompatible with the endpoint; models not listed here pass through.
func checkEndpointSupportsModel(endpoint, model string) error {
	switch endpoint {
	case "/completions":
		switch model {
		case GPT3Dot5Turbo0301, GPT3Dot5Turbo, GPT4, GPT40314, GPT432K, GPT432K0314:
			return ErrCompletionUnsupportedModel
		}
	case "/chat/completions":
		switch model {
		case GPT3TextDavinci003, GPT3TextDavinci002, GPT3TextCurie001, GPT3TextBabbage001, GPT3TextAda001, GPT3TextDavinci001, GPT3DavinciInstructBeta, GPT3Davinci, GPT3CurieInstructBeta, GPT3Curie, GPT3Ada, GPT3Babbage, CodexCodeDavinci002, CodexCodeCushman001, CodexCodeDavinci001:
			return ErrChatCompletionInvalidModel
		}
	}
	return nil
}

@sashabaranov (Owner) commented Mar 22, 2023

@j178 here's a simpler approach:

package main

import "fmt"

// endpointModelPair identifies an (endpoint, model) combination.
type endpointModelPair struct {
	endpoint string
	model    string
}

// disabledPairs lists combinations that are known to be unsupported.
var disabledPairs = map[endpointModelPair]bool{
	{"/completions", "gpt-3.5-turbo"}: true,
}

func main() {
	pairIsDisabled := disabledPairs[endpointModelPair{"/completions", "gpt-3.5-turbo"}]
	fmt.Println("Pair is disabled:", pairIsDisabled)
}

// Output:
// Pair is disabled: true

We could also use a map[string]map[string]bool to get an endpoint -> model -> disabled flag mapping, as sketched below.
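
A minimal sketch of that nested-map variant; the identifier names and the models listed are illustrative, not necessarily the ones used in the PR:

package main

import "fmt"

// Illustrative endpoint -> model -> disabled mapping.
var disabledModelsForEndpoints = map[string]map[string]bool{
	"/completions": {
		"gpt-3.5-turbo": true,
		"gpt-4":         true,
	},
	"/chat/completions": {
		"text-davinci-003": true,
	},
}

// checkEndpointSupportsModel reports whether the model may be used with the endpoint.
func checkEndpointSupportsModel(endpoint, model string) bool {
	return !disabledModelsForEndpoints[endpoint][model]
}

func main() {
	fmt.Println(checkEndpointSupportsModel("/completions", "gpt-3.5-turbo"))    // false
	fmt.Println(checkEndpointSupportsModel("/completions", "text-davinci-003")) // true
}

A lookup into a missing inner map returns false, so unknown endpoint/model pairs are treated as supported by default.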

@j178 changed the title from "Add model check for chat stream" to "refactor: Refactor endpoint and model compatibility check" on Mar 22, 2023
@j178 (Contributor, Author) commented Mar 22, 2023

I applied your suggestion, but I also had to introduce a breaking change: I deleted ErrChatCompletionInvalidModel and ErrCompletionUnsupportedModel, and replaced them with a unified ErrUnsupportedModel. Do you have any better suggestions?
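
For illustration, a minimal sketch of how callers could match such a unified sentinel error; the error message shown here is an assumption:

package main

import (
	"errors"
	"fmt"
)

// Unified sentinel error as described above; the exact message is an assumption.
var ErrUnsupportedModel = errors.New("this model is not supported with this endpoint")

func main() {
	// Callers that previously matched ErrChatCompletionInvalidModel or
	// ErrCompletionUnsupportedModel would now match the single error via errors.Is.
	err := fmt.Errorf("CreateCompletion: %w", ErrUnsupportedModel)
	fmt.Println(errors.Is(err, ErrUnsupportedModel)) // true
}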

(Further inline review comments on completion.go have been resolved.)
@j178 (Contributor, Author) commented Mar 22, 2023

The current CI failure is quite peculiar (the test passes for me locally). The panic stack trace points at stream_test.go:167 and stream_test.go:169, even though stream_test.go only has 165 lines in total.

=== RUN   TestCreateCompletionStreamError
    stream_test.go:167: CreateCompletionStream returned error: this model is not supported with this method, please use CreateChatCompletion client method instead
--- FAIL: TestCreateCompletionStreamError (0.00s)
panic: runtime error: invalid memory address or nil pointer dereference [recovered]
	panic: runtime error: invalid memory address or nil pointer dereference
[signal SIGSEGV: segmentation violation code=0x1 addr=0x0 pc=0x88aac1]

goroutine 171 [running]:
testing.tRunner.func1.2({0x8c6f80, 0xbd9fc0})
	/opt/hostedtoolcache/go/1.19.7/x64/src/testing/testing.go:1396 +0x372
testing.tRunner.func1()
	/opt/hostedtoolcache/go/1.19.7/x64/src/testing/testing.go:1399 +0x5f0
panic({0x8c6f80, 0xbd9fc0})
	/opt/hostedtoolcache/go/1.19.7/x64/src/runtime/panic.go:890 +0x262
github.com/sashabaranov/go-openai_test.TestCreateCompletionStreamError(0xc00027fd40)
	/home/runner/work/go-openai/go-openai/stream_test.go:169 +0x581
testing.tRunner(0xc00027fd40, 0x954fa0)
	/opt/hostedtoolcache/go/1.19.7/x64/src/testing/testing.go:1446 +0x217
created by testing.(*T).Run
	/opt/hostedtoolcache/go/1.19.7/x64/src/testing/testing.go:1493 +0x75e
FAIL	github.com/sashabaranov/go-openai	0.196s
FAIL
Error: Process completed with exit code 1.

@sashabaranov (Owner) commented:

Regarding the panic: I don't see TestCreateCompletionStreamError in your fork at all; could you please sync your fork with the master branch?

@j178 (Contributor, Author) commented Mar 22, 2023

Thanks, should be fixed now.

@sashabaranov (Owner) left a review:

Great work, thank you so much for the PR!

@sashabaranov merged commit 2ebb265 into sashabaranov:master on Mar 22, 2023
@j178 deleted the model-check branch on March 22, 2023 at 13:51