Releases · react-chatbotify-plugins/llm-connector
v0.3.2
v0.3.1
v0.3.0
Fixed:
- Fixed an issue where the @wllama/wllama package caused problems for some users
Note:
WllamaProvider is no longer shipped by default with the plugin, primarily because packaging it into the plugin causes issues that are hard to resolve plugin-side. There is also little practical use case for it at the moment, though the default implementation is still available for users to copy into their own project here.
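For anyone who still needs a local model, the general approach is to copy the provider implementation into your own project and hand it to the plugin as a custom provider. The sketch below shows the rough shape such a hand-rolled provider could take; the `Provider` interface (a single `sendMessages` async generator) and the streaming endpoint are assumptions for illustration, not the plugin's exact contract.

```ts
// Assumed shapes for illustration only; check the plugin's actual
// Provider contract before copying this into a project.
interface ChatMessage {
  sender: string;
  content: string;
}

interface Provider {
  // Assumed: the plugin consumes an async generator of text chunks.
  sendMessages(messages: ChatMessage[]): AsyncGenerator<string>;
}

// Hypothetical stand-in for a copied WllamaProvider: streams a reply
// from a locally hosted endpoint instead of bundling @wllama/wllama.
class LocalLlmProvider implements Provider {
  public constructor(private endpoint: string) {}

  public async *sendMessages(messages: ChatMessage[]): AsyncGenerator<string> {
    const response = await fetch(this.endpoint, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ messages }),
    });
    const reader = response.body?.getReader();
    if (!reader) {
      return;
    }
    const decoder = new TextDecoder();
    while (true) {
      const { done, value } = await reader.read();
      if (done) {
        break;
      }
      // Yield each decoded chunk so the bot can stream it to the user.
      yield decoder.decode(value, { stream: true });
    }
  }
}
```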
v0.2.0
Fixed:
- Fixed an issue where GeminiProvider's `responseFormat` field was required instead of optional (a sketch follows this list)
- Fixed an issue where stop conditions did not abort bot streaming responses
- Fixed the error message not respecting the output type
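With this fix, a `GeminiProvider` can be constructed without a `responseFormat`. A minimal sketch, where the import path and the remaining config field names (`apiKey`, `model`) are assumptions for illustration rather than a verbatim copy of the provider's signature:

```ts
import { GeminiProvider } from "@rcb-plugins/llm-connector";

// responseFormat is omitted; with this fix the provider falls back to
// its default instead of failing the type check. Field names are illustrative.
const provider = new GeminiProvider({
  apiKey: "YOUR_API_KEY",
  model: "gemini-1.5-flash",
});
```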
Added:
- Added an `initialMessage` property within the `llmConnector` attribute to let users specify an initial message easily (see the sketch below)
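A rough usage sketch of the new property is shown below. The overall flow shape and provider configuration are assumptions drawn from typical plugin usage; only the `initialMessage` line illustrates the addition described above.

```tsx
import ChatBot from "react-chatbotify";
import LlmConnector, { GeminiProvider } from "@rcb-plugins/llm-connector";

// Illustrative flow: the llmConnector attribute is assumed to sit on a
// flow block, with initialMessage as the first message the bot sends.
const flow = {
  start: {
    llmConnector: {
      provider: new GeminiProvider({ apiKey: "YOUR_API_KEY", model: "gemini-1.5-flash" }),
      initialMessage: "Hi! Ask me anything.",
    },
  },
};

const MyChatBot = () => <ChatBot plugins={[LlmConnector()]} flow={flow} />;

export default MyChatBot;
```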