Releases: react-chatbotify-plugins/llm-connector

v0.3.2

12 Jun 00:21

Fixed:

  • Fixed an issue where autofocus was applied even when an embedded chatbot was out of view

v0.3.1

10 Jun 16:26

Fixed:

  • Fixed an issue where the OpenAI provider did not work with responseFormat set to json

Added:

  • Added an optional debug property to all three default providers that enables more verbose logging to help during development
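
For illustration, the two v0.3.1 changes above might be used together as follows. The import path, provider name, and configuration keys below are assumptions based on typical usage of the plugin, so check the plugin's documentation for the exact API:

```typescript
// Hypothetical import path and provider name; verify against the plugin docs
import { OpenaiProvider } from "@rcb-plugins/llm-connector";

// Hypothetical configuration keys; `responseFormat` and `debug`
// correspond to the fix and addition noted in this release
const provider = new OpenaiProvider({
  apiKey: process.env.OPENAI_API_KEY!,
  model: "gpt-4o-mini",
  responseFormat: "json", // fixed in v0.3.1
  debug: true,            // new in v0.3.1: prints verbose logs during development
});
```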

v0.3.0

31 May 18:46

Fixed:

  • Fixed an issue where the @wllama/wllama package was causing issues for some users

Note:

WllamaProvider is no longer shipped with the plugin by default, primarily because packaging it into the plugin causes issues that are hard to resolve on the plugin side. There is also currently a lack of practical use cases for it, though the default implementation remains available for users to copy into their project.

v0.2.0

15 May 17:44

Fixed:

  • Fixed an issue where GeminiProvider's responseFormat field was required instead of optional
  • Fixed an issue where stop conditions did not abort the bot's streaming responses
  • Fixed an issue where error messages did not respect the output type

Added:

  • Added an initialMessage property within the llmConnector attribute so users can easily specify an initial message
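
As a sketch, specifying the new property might look like this; the import path, provider options, and flow shape are assumptions rather than the plugin's confirmed API, so refer to the plugin's documentation:

```typescript
// Hypothetical imports; actual names may differ
import ChatBot from "react-chatbotify";
import LlmConnector, { GeminiProvider } from "@rcb-plugins/llm-connector";

// A chatbot flow where the llmConnector block greets the user via `initialMessage`
const flow = {
  start: {
    llmConnector: {
      provider: new GeminiProvider({ apiKey: "...", model: "gemini-1.5-flash" }), // hypothetical keys
      initialMessage: "Hi! Ask me anything.", // new in v0.2.0
    },
  },
};

const App = () => <ChatBot flow={flow} plugins={[LlmConnector()]} />;
```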

v0.1.1

12 May 19:02

Initial Release!