📦 v0.0.3 #215
Merged
Conversation
- Initiated a new type of workflow: a streaming (async) routing workflow, using the Streaming Chat API as an example
- Updated the Bruno collection
- Updated the LanguageModel API to include `ChatStream()` and `SupportChatStream()` methods (sketched below)
- Got the streaming router working
- Implemented SSE event parsing to be able to work with the OpenAI streaming chat API
- Integrated OpenAI chat streaming into Glide's streaming chat API
- Covered the happy path with tests
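A minimal sketch of what the extended LanguageModel interface could look like after this commit. The `ChatStream()` signature, the request/chunk types, and their fields are assumptions for illustration, not Glide's actual definitions (later commits in this PR refine `chatStream()` to return a stream struct):

```go
package providers

import "context"

// ChatRequest and ChatStreamChunk are placeholder shapes, not the real schemas.
type ChatRequest struct {
	Message string
}

type ChatStreamChunk struct {
	Content string
}

// LanguageModel is assumed to gain the two streaming-related methods mentioned above.
type LanguageModel interface {
	// Chat performs a regular synchronous chat exchange.
	Chat(ctx context.Context, req *ChatRequest) (string, error)
	// ChatStream starts a streaming chat exchange; chunks are assumed to be
	// delivered over a read-only channel here.
	ChatStream(ctx context.Context, req *ChatRequest) (<-chan ChatStreamChunk, error)
	// SupportChatStream reports whether the underlying provider supports streaming.
	SupportChatStream() bool
}
```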
…tStream (#166)
- Separated sync and streaming chat schemas
- Extracted assumptions about where to find latency out of routing strategies into separate `LatencyGetters` that can differ across models/workflows
- Elaborated the client provider `chatStream()` interface: clients now expose a response channel instead of being handed one by the caller (see the sketch below)
- Connected the stream chat workflow to latency & health tracking
- Refined the `chatStream()` method of clients to return a stream struct
- Separated latency tracking of the streaming workflow from the sync chat workflow
- Defined a new `HealthTracker` to incorporate all health tracking logic
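A hedged sketch of the "clients own the response channel" idea: the stream struct, its fields, and the `chatStream()` shape below are assumptions for illustration, not the actual provider code.

```go
package providers

import "context"

// ChatStreamChunk is a placeholder chunk type for this sketch.
type ChatStreamChunk struct {
	Content string
}

// ChatStream is the kind of stream struct a provider client could return:
// the client owns and closes the channels, the router only reads from them.
type ChatStream struct {
	chunks chan ChatStreamChunk
	errs   chan error
}

func (s *ChatStream) Chunks() <-chan ChatStreamChunk { return s.chunks }
func (s *ChatStream) Errs() <-chan error             { return s.errs }

// chatStream shows the assumed shape of the refined client method: it creates
// the stream, starts pumping provider events into it, and hands it to the caller.
func chatStream(ctx context.Context) (*ChatStream, error) {
	stream := &ChatStream{
		chunks: make(chan ChatStreamChunk),
		errs:   make(chan error, 1),
	}

	go func() {
		defer close(stream.chunks)
		// In the real client this loop would read provider SSE events and
		// forward them; here a single placeholder chunk is sent.
		select {
		case stream.chunks <- ChatStreamChunk{Content: "hello"}:
		case <-ctx.Done():
		}
	}()

	return stream, nil
}
```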
- Separated chat & chat stream request schemas
- Introduced a new finish reason field
- Added metadata to the stream chat response
- Allowed attaching metadata to a chat stream request, which is then attached to each chat stream chunk (see the schema sketch below)
- Adjusted the error message schema to include request ID and metadata
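A rough sketch of how the stream chat schemas described above could be laid out; the field and type names are illustrative assumptions rather than the exact `pkg/api/schemas` definitions.

```go
package schemas

type FinishReason string

// ChatStreamRequest carries optional caller metadata that is echoed back on
// every chunk of the stream.
type ChatStreamRequest struct {
	Message  string         `json:"message"`
	Metadata map[string]any `json:"metadata,omitempty"`
}

// ChatStreamChunk is a single piece of the streamed response.
type ChatStreamChunk struct {
	Content      string         `json:"content"`
	FinishReason *FinishReason  `json:"finish_reason,omitempty"`
	Metadata     map[string]any `json:"metadata,omitempty"`
}

// ChatStreamError mirrors the adjusted error schema: it carries the request ID
// and the caller metadata alongside the error itself.
type ChatStreamError struct {
	RequestID string         `json:"request_id"`
	Message   string         `json:"message"`
	Metadata  map[string]any `json:"metadata,omitempty"`
}
```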
…age (#184)
- Fixed the header in which the Anthropic API key is passed
- Started propagating token usage of Anthropic requests
- Corrected the TokenUsage interface by changing the count fields from floats to integers (illustrated below)
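An illustrative sketch of the two fixes above. Only the float-to-integer change and the header fix come from the source; the struct layout and the helper are assumptions, with the header names taken from Anthropic's documented API.

```go
package anthropic

import "net/http"

// Token counts are whole tokens, so integers are the natural type here.
type TokenUsage struct {
	PromptTokens   int `json:"prompt_tokens"`
	ResponseTokens int `json:"response_tokens"`
	TotalTokens    int `json:"total_tokens"`
}

// setAuthHeaders is a hypothetical helper showing where the key belongs.
func setAuthHeaders(req *http.Request, apiKey string) {
	// Anthropic expects the API key in the x-api-key header rather than the
	// Authorization: Bearer header used by OpenAI-style APIs.
	req.Header.Set("x-api-key", apiKey)
	req.Header.Set("anthropic-version", "2023-06-01")
}
```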
…penAI, Azure and Cohere (#194)
- Text length bound passed in request params
- Content moderation/toxicity
- The Cohere streaming workflow wasn't working because errMapper was never really initialized; that is fixed in this PR
- Cohere now ignores stream chunk types that Glide doesn't support, such as citation-related chunks (see the sketch below)
- Cohere stream chunks are now set with the correct model name (a placeholder was used before)
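A hedged sketch of the "ignore unsupported Cohere chunk types" behavior: the event-type names follow Cohere's documented stream events, but the struct and helper below are illustrative assumptions, not the actual chat_stream code.

```go
package cohere

import "encoding/json"

// streamEvent is a minimal view of a Cohere stream event for this sketch.
type streamEvent struct {
	EventType string `json:"event_type"`
	Text      string `json:"text"`
}

// handleRawEvent returns the text of a text-generation event and a flag saying
// whether the event should be forwarded to the caller at all.
func handleRawEvent(raw []byte) (string, bool, error) {
	var event streamEvent
	if err := json.Unmarshal(raw, &event); err != nil {
		return "", false, err
	}

	switch event.EventType {
	case "text-generation":
		return event.Text, true, nil
	case "citation-generation", "search-results", "search-queries-generation":
		// Glide doesn't support these chunk types, so they are silently skipped.
		return "", false, nil
	default:
		return "", false, nil
	}
}
```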
…re chat streams correctly (#201)
- Implemented a custom stream reader to correctly handle Cohere streams (sketched below)
- Started handling the stream-start event to propagate generationID to all following chunks
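A minimal sketch of a custom stream reader that remembers the generation ID from the stream-start event and stamps it onto every following chunk. Only the stream-start handling itself comes from the source; the reader structure and names are assumptions.

```go
package cohere

import (
	"bufio"
	"encoding/json"
	"io"
)

type chunk struct {
	EventType    string `json:"event_type"`
	GenerationID string `json:"generation_id"`
	Text         string `json:"text"`
}

// streamReader reads line-delimited Cohere stream events.
type streamReader struct {
	scanner      *bufio.Scanner
	generationID string
}

func newStreamReader(body io.Reader) *streamReader {
	return &streamReader{scanner: bufio.NewScanner(body)}
}

// Next reads one event; stream-start events are consumed to capture the
// generation ID, and all later chunks are returned with that ID attached.
func (r *streamReader) Next() (*chunk, error) {
	for r.scanner.Scan() {
		var c chunk
		if err := json.Unmarshal(r.scanner.Bytes(), &c); err != nil {
			return nil, err
		}
		if c.EventType == "stream-start" {
			r.generationID = c.GenerationID
			continue
		}
		c.GenerationID = r.generationID
		return &c, nil
	}
	if err := r.scanner.Err(); err != nil {
		return nil, err
	}
	return nil, io.EOF
}
```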
…n case of some errors (#203)
- Passed RouterID and ModelID information in the chat stream messages
- Introduced a new ChatStreamMessage type that joins both chunk and error messages (see the sketch below); removed unneeded context from provider chatStream structs
- Defined a set of possible error codes during chat streaming
- Started simplifying logging by using context-based loggers
- Introduced finish_reason on the error schema
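A sketch of how a `ChatStreamMessage` joining chunk and error payloads could be shaped. The type name, RouterID/ModelID, the idea of error codes, and finish_reason on errors come from the source; the concrete fields and the example error codes are assumptions.

```go
package schemas

type ErrorCode string

// Example error codes; the real set defined in the PR may differ.
const (
	ErrCodeNoModelAvailable ErrorCode = "no_model_available"
	ErrCodeUnknownError     ErrorCode = "unknown_error"
)

type ChatStreamChunk struct {
	Content string `json:"content"`
}

type ChatStreamError struct {
	ErrCode      ErrorCode `json:"error_code"`
	Message      string    `json:"message"`
	FinishReason *string   `json:"finish_reason,omitempty"`
}

// ChatStreamMessage is the single message type sent to the caller: exactly one
// of Chunk or Error is set, and routing identity travels with every message.
type ChatStreamMessage struct {
	RouterID string           `json:"router_id"`
	ModelID  string           `json:"model_id"`
	Chunk    *ChatStreamChunk `json:"chunk,omitempty"`
	Error    *ChatStreamError `json:"error,omitempty"`
}
```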
- Fixed validation of nested arrays, so it now reaches all structures, including provider params (see the illustration below)
- Removed the ChatHistory & ConversationID fields from the params
- Added a number of other params such as max_tokens, penalties, k, p, etc.
- Added validations to some params
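A hedged illustration of reaching nested structures with a tag-based validator (assuming github.com/go-playground/validator or something similar is in use, which the source does not state). The param names mirror the ones mentioned above; the exact tags, bounds, and struct layout are assumptions.

```go
package main

import (
	"fmt"

	"github.com/go-playground/validator/v10"
)

// CohereParams is a hypothetical provider-param struct with validation tags.
type CohereParams struct {
	Temperature float64 `validate:"omitempty,gte=0,lte=5"`
	MaxTokens   int     `validate:"omitempty,gte=1"`
	K           int     `validate:"omitempty,gte=0,lte=500"`
	P           float64 `validate:"omitempty,gte=0.01,lte=0.99"`
}

type ModelConfig struct {
	ID     string       `validate:"required"`
	Params CohereParams `validate:"required"`
}

type RouterConfig struct {
	// `dive` tells the validator to descend into each element of the slice,
	// which is what makes nested provider params reachable by validation.
	Models []ModelConfig `validate:"required,min=1,dive"`
}

func main() {
	validate := validator.New()

	cfg := RouterConfig{
		Models: []ModelConfig{
			{ID: "cohere-model", Params: CohereParams{K: 1000}},
		},
	}

	if err := validate.Struct(cfg); err != nil {
		fmt.Println(err) // reports K > 500 inside the nested slice element
	}
}
```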
…g swagger.yaml file (#211)
This change fixes panics like "./docs/swagger.yaml is not found"
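One common way to avoid this class of panic is to embed the spec into the binary instead of reading it from the working directory at runtime. The sketch below illustrates that technique under the assumption that the spec lives next to the package; it is not necessarily the exact fix made in #211.

```go
package docs

import (
	_ "embed"
	"net/http"
)

// The spec is compiled into the binary, so no file lookup happens at runtime.
//go:embed swagger.yaml
var swaggerYAML []byte

// SwaggerYAMLHandler serves the embedded spec, so it works no matter which
// directory the gateway is started from.
func SwaggerYAMLHandler(w http.ResponseWriter, _ *http.Request) {
	w.Header().Set("Content-Type", "application/yaml")
	_, _ = w.Write(swaggerYAML)
}
```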
# Conflicts:
#   README.md
#   docs/docs.go
#   docs/swagger.json
#   docs/swagger.yaml
#   go.mod
#   go.sum
#   pkg/api/http/handlers.go
#   pkg/api/http/server.go
#   pkg/api/schemas/chat_stream.go
#   pkg/gateway.go
#   pkg/providers/azureopenai/chat_stream.go
#   pkg/providers/azureopenai/client.go
#   pkg/providers/cohere/chat.go
#   pkg/providers/cohere/chat_stream.go
#   pkg/providers/cohere/chat_stream_test.go
#   pkg/providers/cohere/client.go
#   pkg/providers/cohere/config.go
#   pkg/providers/cohere/schemas.go
#   pkg/providers/cohere/testdata/chat_stream.success.txt
#   pkg/providers/lang.go
#   pkg/providers/openai/chat.go
#   pkg/providers/openai/chat_stream.go
#   pkg/providers/openai/client.go
#   pkg/providers/provider.go
#   pkg/providers/testing/lang.go
#   pkg/providers/testing/models.go
#   pkg/routers/config.go
#   pkg/routers/router.go
#   pkg/routers/router_test.go
Summary
✨ Brought support for streaming chat in Glide (integrated with OpenAI, Azure OpenAI, and Cohere)
✨ Started handling 401 errors to mark models as permanently unavailable (e.g. when the API key is not correct)
🐛 Fixed the panic related to the swagger.yaml file
🐛 Fixed the Anthropic chat workflow by passing the API key correctly
🔧 Improved Cohere param config and validation
Changelog
Added
Changed
Fixed
Security
Miscellaneous