# Chat Circuit

## Brief overview

Chat Circuit is a desktop application for branching conversations with local LLMs: fork a conversation at any point, re-run branches with edited prompts, and navigate large conversation trees with zoom and a minimap.

## Short demos

### Re-run all nodes in a branch

Chat Circuit now makes it possible to re-run a branch of your conversation with an LLM using a different prompt. It supports all local LLMs running on @ollama.

August 6, 2024

### Generate follow-up questions

Implemented this idea in Chat Circuit. Here is a quick demo of the application along with generating follow-up questions using an LLM.

August 20, 2024
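For a sense of how follow-up generation might work, here is a minimal sketch using LiteLLM against a local Ollama model (the `ollama_chat/` prefix matches the models.conf format shown later in this README); the function name, prompt wording, and default model are assumptions, not Chat Circuit's actual code.

```python
import litellm

def suggest_follow_ups(conversation: str, model: str = "ollama_chat/llama3") -> list[str]:
    """Ask the model for a few short follow-up questions, one per line."""
    response = litellm.completion(
        model=model,
        messages=[
            {"role": "system",
             "content": "Suggest three short follow-up questions, one per line."},
            {"role": "user", "content": conversation},
        ],
    )
    text = response.choices[0].message.content
    # Return each non-empty line as one follow-up question.
    return [line.strip() for line in text.splitlines() if line.strip()]
```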

### Zoom in/out

🔍 Added a small feature to zoom in using mouse selection. Handy for looking at deep branches.

August 22, 2024

Minimap Support

#ChatCircuit Added a mini-map with the help of Sonnet 3.5 in @poe_platform.

Would have taken me days if not weeks to do it without any help. 🙏

~ 99% of code is written by Claude September 25, 2024

### Export to JSON Canvas Document

Added an option to export to a JSON Canvas document that can be imported by any supporting application, such as @obsdmd or @KinopioClub.

September 26, 2024
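As a rough illustration of the target format, the sketch below emits the open JSON Canvas structure (https://jsoncanvas.org): a top-level object with `nodes` and `edges` arrays. The field names follow the published spec, but the helper `to_json_canvas`, its input shape, and the fixed node size are assumptions for illustration.

```python
import json

def to_json_canvas(nodes, edges) -> str:
    """nodes: iterable of (id, text, x, y); edges: iterable of (id, from_id, to_id)."""
    canvas = {
        "nodes": [
            # "text" nodes are the simplest JSON Canvas node type.
            {"id": nid, "type": "text", "text": text,
             "x": x, "y": y, "width": 320, "height": 160}
            for nid, text, x, y in nodes
        ],
        "edges": [
            {"id": eid, "fromNode": src, "toNode": dst}
            for eid, src, dst in edges
        ],
    }
    return json.dumps(canvas, indent=2)
```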

## Features

- **Multi-Branch Conversations**: Create and manage multiple conversation branches seamlessly.
- **Contextual Forking**: Fork conversation branches with accurate context retention (see the sketch below).
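To make contextual forking concrete, here is a hedged sketch of one way it can work, assuming each node keeps a parent pointer so a fork's context is simply the chain of its ancestors; `ChatNode`, `fork`, and `context` are illustrative names, not Chat Circuit's internals.

```python
from dataclasses import dataclass, field

@dataclass
class ChatNode:
    prompt: str
    answer: str = ""
    parent: "ChatNode | None" = None
    children: list["ChatNode"] = field(default_factory=list)

    def fork(self, prompt: str) -> "ChatNode":
        """Start a new branch from this point in the conversation."""
        child = ChatNode(prompt=prompt, parent=self)
        self.children.append(child)
        return child

    def context(self) -> list[dict]:
        """Chat messages from the root down to this node's prompt."""
        chain, node = [], self
        while node is not None:
            chain.append(node)
            node = node.parent
        messages = []
        for n in reversed(chain):
            messages.append({"role": "user", "content": n.prompt})
            # Skip this node's own answer: it is about to be (re)generated.
            if n is not self and n.answer:
                messages.append({"role": "assistant", "content": n.answer})
        return messages
```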

## Editor Features

- Save and load diagrams
- Undo and redo
- Zoom and pan
- Re-run nodes in a branch

It is possible to re-run all the nodes in a branch after changing the prompt in any node in that branch.
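Building on the hypothetical `ChatNode` sketched above, a re-run could walk the subtree below the edited node and re-query the model at each step, so downstream answers pick up the new prompt; `rerun_branch` and the `ask` callable are assumptions, not the application's actual code.

```python
def rerun_branch(node: "ChatNode", ask) -> None:
    """Re-evaluate a node and every node below it on the branch.

    ask(messages) -> str can be any chat-completion callable, e.g. a
    thin wrapper around a local Ollama model.
    """
    node.answer = ask(node.context())
    for child in node.children:
        rerun_branch(child, ask)
```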

## Running the Application

To run this application, follow these steps:

1. Generate the models configuration file:

   ```bash
   ollama list | tail -n +2 | awk '{print $1}' > models.conf
   ```

2. Install dependencies:

   ```bash
   python3 -m pip install -r requirements.txt
   ```

3. Run the application:

   ```bash
   python3 main.py
   ```

## Model Configuration

The available LLM models are loaded from models.conf in the current directory (see models.conf.example). The default model is the first one in the list.

You can also generate the models.conf file with:

```bash
ollama list | tail -n +2 | awk '{print "ollama_chat/"$1}' > models.conf
```

Note: If models.conf is not found, the application uses a default set of models.
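For illustration, here is a minimal sketch of how such a loader might behave under the assumptions stated in this section (one model name per line, the first entry is the default, and a fallback list is used when the file is missing); `load_models` and the fallback name are hypothetical, not Chat Circuit's actual code.

```python
from pathlib import Path

# Placeholder fallback; the application's real default list may differ.
FALLBACK_MODELS = ["ollama_chat/llama3"]

def load_models(path: str = "models.conf") -> list[str]:
    """Return one model name per non-empty line, or the fallback list."""
    p = Path(path)
    if not p.exists():
        return FALLBACK_MODELS
    models = [line.strip() for line in p.read_text().splitlines() if line.strip()]
    return models or FALLBACK_MODELS

models = load_models()
default_model = models[0]  # the first entry is the default model
```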