Releases: TabbyML/tabby
v0.18.0-rc.0
v0.17.0
⚠️ Notice
- We've reworked the Web (a beta feature) context provider into the Developer Docs context provider. Previously added context in the Web tab has been cleared and needs to be manually migrated to Developer Docs.
🚀 Features
- The Answer Engine search box has been extensively reworked.
- Developer Docs / Web search is now triggered by `@`.
- Repository Context is now selected using `#`.
- Supports OCaml
💫 New Contributors
- @eryue0220 made their first contribution in #2921
- @th0rgall made their first contribution in #3041
- @tcmzzz made their first contribution in #3093
Full Changelog: v0.16.1...v0.17.0
v0.17.0-rc.6
v0.17.0-rc.5
v0.17.0-rc.4
v0.17.0-rc.3
v0.17.0-rc.2
v0.17.0-rc.1
v0.17.0-rc.0
v0.16.1
⚠️ Notice
- Starting from this version, we use WebSockets for features that require streaming (e.g., the Answer Engine and Chat Side Panel). If you are deploying Tabby behind a reverse proxy, you may need to configure the proxy to support WebSockets.
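  For example, with nginx the connection-upgrade headers must be forwarded explicitly or the streaming endpoints will fail behind the proxy. A minimal sketch, assuming nginx and a Tabby instance on `127.0.0.1:8080` (the server name and upstream address are placeholders, not Tabby defaults):

  ```nginx
  # Proxy HTTP traffic to Tabby and allow WebSocket upgrades.
  # Upstream address, port, and server_name below are placeholders.
  server {
      listen 80;
      server_name tabby.example.com;

      location / {
          proxy_pass http://127.0.0.1:8080;
          proxy_http_version 1.1;
          # These two headers permit upgrading the connection to a WebSocket.
          proxy_set_header Upgrade $http_upgrade;
          proxy_set_header Connection "upgrade";
          proxy_set_header Host $host;
      }
  }
  ```

  Other reverse proxies (Caddy, Traefik, HAProxy) need the equivalent upgrade-header passthrough; consult their documentation for the specific directives.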
🚀 Features
- Discussion threads in the Answer Engine are now persisted, allowing users to share threads with others.
🧰 Fixed and Improvements
- Fixed an issue where the llama-server subprocess was not reused when the same model was shared for Chat / Completion (e.g., Codestral-22B) with the local model backend.
- Updated llama.cpp to version b3571 to support the jina series embedding models.
💫 New Contributors
- @richginsberg made their first contribution in #2738
- @VladislavNekto made their first contribution in #2804
- @Sherlock113 made their first contribution in #2809
- @zwpaper made their first contribution in #2812
- @cclauss made their first contribution in #2832
- @richard-jfc made their first contribution in #2835
Full Changelog: v0.15.0...v0.16.1