Add dependency injection to ChatCompletionStream for improved testability #1011
base: master
Conversation
Hi @sashabaranov 👋, first-time contributor here. Fixing up those lint issues now.
Commits:
- Add dependency injection to ChatCompletionStream for improved testability
- Lint fix

Force-pushed from 370f947 to 857bb0d (Compare)
Codecov Report
Attention: Patch coverage is …

```diff
@@            Coverage Diff             @@
##           master    #1011      +/-   ##
==========================================
+ Coverage   85.77%   85.83%   +0.06%
==========================================
  Files          43       43
  Lines        2291     2315      +24
==========================================
+ Hits         1965     1987      +22
- Misses        304      306       +2
  Partials       22       22
```
```diff
+// ChatStreamReader is an interface for reading chat completion streams.
+type ChatStreamReader interface {
+	Recv() (ChatCompletionStreamResponse, error)
+	Close() error
+}
+
 // ChatCompletionStream
 // Note: Perhaps it is more elegant to abstract Stream using generics.
 type ChatCompletionStream struct {
-	*streamReader[ChatCompletionStreamResponse]
+	reader ChatStreamReader
 }
+
+// NewChatCompletionStream allows injecting a custom ChatStreamReader (for testing).
+func NewChatCompletionStream(reader ChatStreamReader) *ChatCompletionStream {
+	return &ChatCompletionStream{reader: reader}
+}
```
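The excerpt above stops at the constructor and does not show the delegation methods the PR description mentions. As a minimal sketch (not the PR's actual code), the two methods declared on `ChatStreamReader` could be forwarded to the injected reader as follows; how `Header()` and `GetRateLimitHeaders()` are delegated is not visible in this excerpt.

```go
package openai

// Minimal sketch, assuming the types from the diff above: ChatCompletionStream
// forwards the two methods declared on ChatStreamReader to the injected reader.
func (s *ChatCompletionStream) Recv() (ChatCompletionStreamResponse, error) {
	return s.reader.Recv()
}

func (s *ChatCompletionStream) Close() error {
	return s.reader.Close()
}
```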
@sashabaranov Does this approach fulfill your intent in the Note comment?
**Describe the change**

This PR refactors the `ChatCompletionStream` to use dependency injection by introducing a `ChatStreamReader` interface. This allows injecting custom stream readers, primarily for testing purposes, making the streaming functionality more testable and maintainable.

**Provide OpenAI documentation link**

https://platform.openai.com/docs/api-reference/chat/create

**Describe your solution**

The changes include:

- Added a `ChatStreamReader` interface that defines the contract for reading chat completion streams
- Refactored `ChatCompletionStream` to use composition with a `ChatStreamReader` instead of embedding `streamReader`
- Added a `NewChatCompletionStream()` constructor function to enable dependency injection
- Implemented explicit delegation methods (`Recv()`, `Close()`, `Header()`, `GetRateLimitHeaders()`) on `ChatCompletionStream`
- Added an interface compliance check via `var _ ChatStreamReader = (*streamReader[ChatCompletionStreamResponse])(nil)`

This approach maintains backward compatibility while enabling easier mocking and testing of streaming functionality.

**Tests**

Added comprehensive tests demonstrating the new functionality:

- `TestChatCompletionStream_MockInjection`: tests basic mock injection with the new constructor
- `mock_streaming_demo_test.go`: a complete demonstration file showing how to create mock clients and stream readers for testing, including:
  - `MockOpenAIStreamClient`: full mock client implementation
  - `mockStreamReader`: custom stream reader for controlled test responses
  - `TestMockOpenAIStreamClient_Demo`: demonstrates assembling multiple stream chunks
  - `TestMockOpenAIStreamClient_ErrorHandling`: shows error handling patterns

**Additional context**

This refactoring improves the testability of code that depends on go-openai streaming without introducing breaking changes. The existing public API remains unchanged, but now supports dependency injection for testing scenarios. The new demo test file serves as documentation for users who want to mock streaming responses in their own tests.
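The demo test file itself is not reproduced on this page. Below is a minimal sketch of the pattern the Tests section describes, built on the PR's `ChatStreamReader` interface and `NewChatCompletionStream` constructor; the fake reader, test name, and canned chunks are hypothetical and only for illustration.

```go
package example_test

import (
	"errors"
	"io"
	"testing"

	openai "github.com/sashabaranov/go-openai"
)

// fakeStreamReader is a hypothetical ChatStreamReader that serves canned chunks
// and then reports io.EOF, mimicking a stream that has finished.
type fakeStreamReader struct {
	chunks []openai.ChatCompletionStreamResponse
	pos    int
}

func (f *fakeStreamReader) Recv() (openai.ChatCompletionStreamResponse, error) {
	if f.pos >= len(f.chunks) {
		return openai.ChatCompletionStreamResponse{}, io.EOF
	}
	chunk := f.chunks[f.pos]
	f.pos++
	return chunk, nil
}

func (f *fakeStreamReader) Close() error { return nil }

func TestAssembleStreamedChunks(t *testing.T) {
	chunks := []openai.ChatCompletionStreamResponse{
		{Choices: []openai.ChatCompletionStreamChoice{{Delta: openai.ChatCompletionStreamChoiceDelta{Content: "Hello, "}}}},
		{Choices: []openai.ChatCompletionStreamChoice{{Delta: openai.ChatCompletionStreamChoiceDelta{Content: "world!"}}}},
	}
	stream := openai.NewChatCompletionStream(&fakeStreamReader{chunks: chunks})
	defer stream.Close()

	var got string
	for {
		resp, err := stream.Recv()
		if errors.Is(err, io.EOF) {
			break
		}
		if err != nil {
			t.Fatalf("unexpected error: %v", err)
		}
		got += resp.Choices[0].Delta.Content
	}
	if got != "Hello, world!" {
		t.Fatalf("assembled %q, want %q", got, "Hello, world!")
	}
}
```

The same fake can be made to return a non-EOF error after a few chunks to exercise the error path that `TestMockOpenAIStreamClient_ErrorHandling` is described as covering.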