Releases: Vali-98/ChatterUI
v0.8.5
This update focuses on UI and code cleanup - there aren't any big functional changes to the app, but a lot has been reworked under the hood to keep the app maintainable for future updates. This is an accumulation of all 0.8.5 beta releases.
Disclaimers:
- Your Logs will be reset with this new version. If you need them backed up, you should do it before updating.
- Your Model startup config will be reset
Features
- Added Custom Themes to ChatterUI! This system replaces the old hue-based selector. Feel free to discuss and share your themes here!
- By default, these themes are included:
- Lavender Dark
- Lavender Light
- AMOLED Black & White
- Navy Dark
- Pink
- Retro Green
- You can import custom themes so long as they respect ChatterUI's theme JSON format (see the sketch after this list).
- Refer here on how to build custom themes!
- Added `keep_alive` parameter for Ollama
- Added `include_reasoning` parameter for OpenRouter
- Added `<think> ... </think>` Markdown definitions to allow collapsing reasoning into an accordion.
- KoboldCPP now accepts an API key by default when using the `--password` field on the server. Leave this empty if you don't use custom keys.
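
As a rough illustration of the theme import feature above: a theme is just a JSON file of named colors. The field names below are placeholders, not ChatterUI's actual schema - refer to the theme guide linked above for the real format.

```ts
// Hypothetical theme file - the real ChatterUI schema may use different keys.
// Save as .json and import it from the Change Theme menu.
const exampleTheme = {
    name: 'Midnight Teal', // display name in the theme picker (assumed field)
    mode: 'dark',          // light or dark base (assumed field)
    colors: {
        primary: '#14b8a6',    // accent color (assumed field)
        background: '#0f172a', // app background (assumed field)
        text: '#e2e8f0',       // main text color (assumed field)
    },
}

export default exampleTheme
```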
Changes
- Almost every screen was modified in some manner with touch-ups to the new styling system.
- Removed Legacy API system.
- Reworked how Logs are displayed and stored - logs should now be colored based on the Log Level
- Changed AnimatedEllipses to use base react-native Animated - this may be slightly buggy but prevents issues with Modals
- Removed fade up animation from chat messages to prevent a potential crash on longer chats.
Fixes
- Inability to create new Sampler presets.
- Key fields should now be secure and invisible
- Fixed DeepSeek-R1 Qwen 1.5b Distill crashing
v0.8.5-beta4
This is a patch with few actual features aside from underlying engine changes.
Disclaimers:
- Your Logs will be reset with this new version. If you need them backed up, you should do it before updating.
- Your Model startup config will be reset
Additions:
- Added `keep_alive` for Ollama
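
For context, `keep_alive` is Ollama's own request parameter that controls how long the model stays loaded in memory after a request. A minimal sketch against Ollama's `/api/generate` endpoint (model name and duration are just examples):

```ts
// Minimal sketch: keep_alive accepts a duration like "10m", 0 to unload
// immediately, or -1 to keep the model loaded indefinitely.
const res = await fetch('http://localhost:11434/api/generate', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
        model: 'llama3.2',  // example model
        prompt: 'Hello!',
        stream: false,
        keep_alive: '10m',  // the value ChatterUI now lets you set
    }),
})
const data = await res.json()
console.log(data.response)
```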
Changes:
- Reworked how Logs are displayed and stored - logs should now be colored based on the Log Level
- Changed AnimatedEllipses to use base `react-native` Animated - this may be slightly buggy but prevents issues with Modals
Fixes:
- Removed animations from the Chat Window for potential fix to #174
- Dropdown text colors being incorrect
- Ollama missing streaming prop
- Made keys invisible
v0.8.5-beta3
This pre-release introduces the long awaited custom themes for ChatterUI!
Share your themes here: #218
Features:
- Added the `Change Theme` menu to select a different app color scheme.
- Added several default themes:
- Lavender Dark
- Lavender Light
- AMOLED Black & White
- Navy Dark
- Pink
- Retro Green
- Added functionality to import custom Themes.
v0.8.5-beta2
This pre-release introduces a Light Mode to ChatterUI with the new experimental styling system. There aren't any major functional changes aside from the styling of the app.
Additions:
- Added `include_reasoning` sampler for OpenRouter reasoning models (see the sketch after this list).
- Added custom markdown formatting for `<think> ... </think>` tags. They should now properly render in a collapsed accordion.
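
Roughly, `include_reasoning` maps onto OpenRouter's chat completions request like the sketch below (model choice and response handling are illustrative, not ChatterUI's actual code):

```ts
// Sketch: ask OpenRouter to return reasoning tokens alongside the answer.
const res = await fetch('https://openrouter.ai/api/v1/chat/completions', {
    method: 'POST',
    headers: {
        Authorization: `Bearer ${process.env.OPENROUTER_API_KEY}`, // your key
        'Content-Type': 'application/json',
    },
    body: JSON.stringify({
        model: 'deepseek/deepseek-r1', // example reasoning model
        messages: [{ role: 'user', content: 'Why is the sky blue?' }],
        include_reasoning: true,       // the toggle ChatterUI now exposes
    }),
})
const json = await res.json()
// When returned, the reasoning typically arrives as its own field on the message.
console.log(json.choices?.[0]?.message)
```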
Changes:
- Touched up every single screen in the app and migrated them to the new styling system.
- Removed old 'Hue' based custom styling.
- Introduced experimental Light Mode
- For now, you can simply toggle between the default dark and light mode.
- In future, this will be a theme manager screen with many different built in themes and importable schemas.
- Reworked components to use new unified Themed components instead - this should make screens in the app feel more homogeneous.
- Removed Legacy API system.
Dev:
- Migrated more files into dedicated folders and utils.
- The current goal is to completely deprecate the old Global enum and Global.ts barrel file.
v0.8.5-beta1
This is a beta release which introduces experimental styling features and fixes.
Changes:
- A few screens have implemented the new styling system.
- Many components have been changed to the new styling system which does not support customization for now.
- KoboldCPP now accepts an API key by default when using the `--password` field on the server. Leave this empty if you don't use custom keys.
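
For anyone wiring this up manually: the value you enter as the key is assumed to be sent as a standard Bearer token, matching how KoboldCPP's `--password` option is usually consumed. A rough sketch (URL, port and payload are illustrative):

```ts
// Sketch: authenticate against a KoboldCPP server started with --password.
const KOBOLD_URL = 'http://localhost:5001' // default KoboldCPP port
const API_KEY = 'my-password'              // same value passed to --password

const res = await fetch(`${KOBOLD_URL}/api/v1/generate`, {
    method: 'POST',
    headers: {
        Authorization: `Bearer ${API_KEY}`, // assumed Bearer scheme
        'Content-Type': 'application/json',
    },
    body: JSON.stringify({ prompt: 'Hello!', max_length: 64 }),
})
console.log(await res.json())
```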
Fixes:
- Inability to create new Sampler presets.
Dev:
- Redid app file organization
- Renamed components for consistency
- Merged and removed many redundant components.
v0.8.4a
Quick patch:
- Updated cui-llama.rn for DeepSeek R1 model support.
- Added default DeepSeek Instruct format.
v0.8.4
A small update with minor fixes and changes. Most changes are on the backend for maintainability purposes and preparation for future UI and localization reworks.
Changes:
- Updated cui-llama.rn - This should properly add support for Falcon models and fix a few issues with performance.
- Chats now export to .json instead of .txt.
- Sampler presets have been moved to a Zustand state rather than the ancient json file system. Sampler presets should automatically be migrated to the new system. This doesn't change much on the user-side, but may result in slightly faster transition time to the Sampler menu.
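
For the curious, the migration follows the usual Zustand pattern. The sketch below is not ChatterUI's actual store - the store shape and field names are illustrative only:

```ts
// Hypothetical sampler preset store - names are illustrative, not ChatterUI's code.
import { create } from 'zustand'
import { persist } from 'zustand/middleware'

interface SamplerPreset {
    name: string
    temperature: number
    top_p: number
}

interface SamplerState {
    presets: SamplerPreset[]
    addPreset: (preset: SamplerPreset) => void
}

export const useSamplerStore = create<SamplerState>()(
    persist(
        (set) => ({
            presets: [],
            addPreset: (preset) =>
                set((state) => ({ presets: [...state.presets, preset] })),
        }),
        // Persisted under this key instead of the old standalone JSON file;
        // a React Native build would also pass an AsyncStorage adapter here.
        { name: 'sampler-presets' }
    )
)
```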
Fixes:
- swipe buttons being off-screen for smaller devices
- file sizes for models being cut off at 2GB - models will need to be reimported to fix this, but this bug is purely visual and does not impact performance.
- The 'No Characters Found', index and 'No Models Found' pages no longer flash into view.
- Slider values not clamping properly, leading to values such as `10.0000001` appearing.
- Context Length for Chat Completions being calculated completely wrong.
- Reimplemented missing 'Bypass Context Length'.
Dev:
- prepared i18n packages for future UI and localization reworks
- refactored entire TTS state to be separate from its component
- developer builds now use a different app id to allow parallel installs with release version
v0.8.3
Preface:
Merry Christmas and Happy New Year! This will probably be the last release for this year (aside from critical bug fixes) and I just want to say thanks to everyone who has supported the project! ChatterUI has grown from a simple curiosity to an awesome tool used by thousands, and it wouldn't have been possible without your support and feedback.
v0.8.3a
Fixes:
- SliderInput text not updating when changing Sampler presets.
- Custom API Templates incorrectly adding duplicate entries from the base OpenAI template.
v0.8.3
BREAKING CHANGE: Llama.cpp no longer supports Q4_0_4_4, Q4_0_4_8 and Q4_0_8_8. Instead, Q4_0 will automatically be aligned for ARM optimized kernels, use Q4_0 from now on!
This is a cumulative update of all beta features over the last month, including the major migration to Expo SDK 52 and the new React-Native architecture.
This update introduces the new experimental API Manager and API Configuration templates! This should allow you to add APIs missing from ChatterUI to a degree. This is mostly useful for APIs which are almost compliant with the OpenAI spec but may have extra or reduced sampler options. These configurations are shareable JSON files which you can pass around to add new API compatibility to ChatterUI!
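
The real template schema is documented in the linked discussion; purely as an illustration of the idea, a template is a small JSON description of an OpenAI-like endpoint plus the samplers it supports (every field name below is a placeholder):

```ts
// Placeholder API Configuration Template - field names are illustrative only.
const exampleApiTemplate = {
    name: 'My Almost-OpenAI API',       // label shown in the API Manager (assumed)
    endpoint: 'https://example.com/v1/chat/completions',
    useBearerAuth: true,                // send the key as a Bearer token (assumed)
    samplers: ['temperature', 'top_p'], // reduced sampler set vs. the OpenAI template (assumed)
}
```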
Known Issue:
- Softlock: If you have an Animated Ellipses visible during generations and open a menu that causes an alert, the alert will be invisible.
Features:
- Updated to Expo SDK 52. This is mostly an under-the-hood change, but should make the app feel a lot more responsive, some screens load much faster than before, and app startup should feel a lot quicker.
- Added a new API Manager!
- This API manager functions similarly to the recently added Model manager.
- If you prefer the old API Manager, simply go to Settings > Use Legacy API
- You can now have multiple connection presets to the same API type
- You can now create your own API Configuration Templates! Refer to this discussion to give feedback and suggestions.
- Added an option for landscape mode in settings. This is automatically enabled for tablets.
- Added a unique component for editing string arrays such as `stop_sequence`.
- Added a `Lock App` feature which will require user authentication via PIN/Pattern/Biometrics when starting the app (see the sketch after this list).
  - This is disabled by default, and can be enabled in the Settings menu.
- Added `Notifications` on completion, which will let you know when generations are complete while the app is in the background.
  - This is disabled by default, and can be enabled in the Settings menu.
  - You can toggle showing the message and character names from the notification or just show a generic 'Completion completed' notification.
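
As a sketch of what the Lock App gate does at startup, here is the general pattern using expo-local-authentication (an assumed library choice - ChatterUI's actual implementation may differ):

```ts
// Sketch: gate app startup behind PIN/Pattern/Biometrics.
import * as LocalAuthentication from 'expo-local-authentication'

export const unlockApp = async (): Promise<boolean> => {
    const hasHardware = await LocalAuthentication.hasHardwareAsync()
    if (!hasHardware) return true // nothing to authenticate with, let the app open

    const result = await LocalAuthentication.authenticateAsync({
        promptMessage: 'Unlock ChatterUI',
    })
    return result.success // only render the app once this resolves to true
}
```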
Changes:
- Updated llama.cpp: This brings in a new feature which requantizes Q4_0 models into Q4_0_X_X for optimized arm kernels upon loading the model, removing the need for special quantizations.
- Dry Sampling is now available in Local Mode!
- Added a new dropdown bottom sheet. This will replace a few old dropdown menus in future.
- Instruct Menu now uses new dropdown and popup menu for button functions.
- Characters and Chats will no longer have `last_modified` update when accessed, only when edited or chatted.
- Checkboxes have been changed to a new custom component.
- Changed animated ellipses during generations
- Updated to a new Markdown formatter! This formatter should be far better when dealing with mixed bullet types and nested lists.
- All inferencing is now done in a background task - this fixes an issue where generations are paused during the prompt building phase due to tabbing out of the app too quickly after a generation begins.
Fixes:
- Incorrect Gemma2 Instruct format.
- Exporting strings resulting in broken base64 files
- Cohere API being completely broken
- Fixed a softlock on Character List if you close search bar while a filter removes all characters from the list
- Fixed a crash on startup due to importing an unsupported model.
v0.8.3-beta5
BREAKING FEATURE: Llama.cpp no longer supports Q4_0_4_4, Q4_0_4_8 and Q4_0_8_8. Instead, Q4_0 will automatically be aligned for ARM optimized kernels, use Q4_0 from now on.
Features:
- Reimplemented Horde for the new API system.
Changes:
- Updated Sliders which should fix buggy behavior.
- This should also fix tapping out of the slider textbox not updating the value.
- Sync llama.cpp:
- This removes support for Q4_0_4_4, Q4_0_4_8 and Q4_0_8_8. These models should now remain on Q4_0 only.
- ChatML is now the default instruct format on first install
- Editing chat should automatically scroll to bottom
Fixes:
- Legacy API system not working correctly.
- Incorrect Gemma2 Instruct format.
v0.8.3-beta4
Just a small update with a few important fixes.
Changes:
- Instruct Menu now uses new dropdown and popup menu for button functions.
Fixes:
- The Samsung/Pixel header button bug for the API Manager.
- Exporting strings resulting in broken base64 files
- Incorrect response parsers for OpenAI, Text Completions and Chat Completions
- Checkboxes not updating properly
v0.8.3-beta3
This update introduces the new experimental API Manager and API Configuration templates! This should allow you to add APIs missing from ChatterUI to a degree. This is mostly useful for APIs which are almost compliant with the OpenAI spec but may have extra or reduced sampler options. These configurations are shareable JSON files which you can pass around to add new API compatibility to ChatterUI!
Features:
- Added a new API Manager!
- This API manager functions similarly to the recently added Model manager.
- If you prefer the old API Manager, simply go to Settings > Use Legacy API
- You can now have multiple connection presets to the same API type
- You can now create your own API Configuration Templates! Refer to this discussion to give feedback and suggestions.
- Updated llama.cpp: This brings in a new feature which requantizes Q4_0 models into Q4_0_X_X for optimized arm kernels upon loading the model, removing the need for special quantizations.
Changes:
- Added a new dropdown bottom sheet. This will replace a few old dropdown menus in future.
- Characters and Chats will no longer have `last_modified` update when accessed, only when edited or chatted.
- Checkbox styles have been updated!
Fixes:
- Fixed chat displaying upside-down.
- Fixed header buttons not working on Samsung and Pixel devices.
- Fixed Markdown formatter not applying properly in specific cases.
- Fixed a crash on startup due to importing an unsupported model.
Known Issues:
- Popups in screen headers may be incorrectly placed.
- Multiline TextInput fields will not resize to full height on a blank newline.
- During generations, if AnimatedEllipsis is visible, it will break Modals such as Alerts and TextBoxModals until the generation ends.