Releases: TabbyML/tabby

v0.18.0-rc.0 (Pre-release)

19 Sep 03:48

v0.17.0

11 Sep 01:18

⚠️ Notice

  • We've reworked the Web context provider (a beta feature) into the Developer Docs context provider. Context previously added in the Web tab has been cleared and must be manually migrated to Developer Docs.

πŸš€ Features

  • The Answer Engine search box has been extensively reworked:
    • Developer Docs / Web search is now triggered by @.
    • Repository Context is now selected using #.
  • Added support for OCaml.

πŸ’« New Contributors

Full Changelog: v0.16.1...v0.17.0

v0.17.0-rc.6 (Pre-release)

10 Sep 21:23

v0.17.0-rc.5 (Pre-release)

10 Sep 03:02

v0.17.0-rc.4 (Pre-release)

09 Sep 08:26

v0.17.0-rc.3 (Pre-release)

06 Sep 21:48

v0.17.0-rc.2 (Pre-release)

05 Sep 23:54

v0.17.0-rc.1 (Pre-release)

05 Sep 23:00

v0.17.0-rc.0 (Pre-release)

04 Sep 07:50

v0.16.1

28 Aug 04:32

⚠️ Notice

  • Starting from this version, we use WebSockets for features that require streaming (e.g., the Answer Engine and the Chat Side Panel). If you deploy Tabby behind a reverse proxy, you may need to configure the proxy to support WebSockets; a sketch is given below.
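
For illustration only, a minimal Nginx server block that forwards the WebSocket upgrade handshake might look like the following sketch. The hostname and upstream address are assumptions (Tabby serves HTTP on port 8080 by default) and should be adjusted to your deployment:

    server {
        listen 80;
        server_name tabby.example.com;  # hypothetical hostname

        location / {
            # Assumed upstream: Tabby listening on its default port 8080.
            proxy_pass http://127.0.0.1:8080;

            # Forward the HTTP/1.1 Upgrade handshake so WebSocket
            # connections are not downgraded by the proxy.
            proxy_http_version 1.1;
            proxy_set_header Upgrade $http_upgrade;
            proxy_set_header Connection "upgrade";
            proxy_set_header Host $host;
        }
    }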

πŸš€ Features

  • Discussion threads in the Answer Engine are now persisted, allowing users to share threads with others.

🧰 Fixes and Improvements

  • Fixed an issue where the llama-server subprocess was not reused when the same model was used for both Chat and Completion (e.g., Codestral-22B) with the local model backend; see the configuration sketch after this list.
  • Updated llama.cpp to version b3571 to support the jina series embedding models.
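
For context, the shared-subprocess case arises when the same local model is configured for both roles. The following is a sketch only, assuming the local model backend syntax of ~/.tabby/config.toml; the model name is just an example:

    # ~/.tabby/config.toml
    # The same local model serves both roles, so a single llama-server
    # subprocess can be shared between Chat and Completion.

    [model.completion.local]
    model_id = "Codestral-22B"

    [model.chat.local]
    model_id = "Codestral-22B"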

πŸ’« New Contributors

Full Changelog: v0.15.0...v0.16.1