Note: This project is in an early development stage, so expect bugs.
Demo video: luna-preview.mp4
- Run easily from the terminal
- Generated text is saved to your clipboard
- Select a model from Ollama
- Ollama
- Node.js
- Clone or download the repository to your local machine. Make sure you have the prerequisites installed.
- Open a terminal and navigate to the project directory.
- Run `npm install` to install the dependencies.
- Run `node setup.cjs` to set up the project. This creates a command alias for `luna` (it is not permanent yet, but it will be in a future update).
- Run `luna chat` to start chatting with Luna from the terminal (the full command sequence is shown below).
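For reference, here is the whole setup as a single shell session; the repository URL and folder name are placeholders, so substitute your own clone path:

```sh
# Clone the repository (URL and folder name are placeholders) and enter it
git clone <repository-url> luna
cd luna

# Install the dependencies
npm install

# Set up the project; this creates the (non-permanent) `luna` command alias
node setup.cjs

# Start chatting with Luna from the terminal
luna chat
```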
As of now, the main objective of this project is to provide a simple, easy-to-use interface for Ollama. More features will be added in future updates.
I wanted to run an LLM locally, but I am not a fan of interfaces like WebUI: I have to open a browser, type my query, and then copy the response, and my machine is not powerful enough to comfortably run a browser and a model at the same time (it's not like it can't, but anyway...). So I decided to create a simple terminal interface for Ollama. I use the terminal a lot, and it is the better solution for me: I just write the query, the response lands in my clipboard, and I can paste it anywhere I want.
This project is too stupid to have a license. Do whatever you want with it. I don't care.
Please roast me for my bad code. Create issues and pull requests to humiliate me. Maybe someday I will be the one who will be roasting you. Who knows?
Please draw an anime waifu character that fits the name "Luna", and I will make it the face of this project. (It must have white hair.)
