vanvuongngo/ollama-tauri-qwik

# AI Agent for Ollama 2025

Demo of using Ollama + Tauri v2 + Qwik

This demo is inspired by the ollama-tauri-client project by Ilya F.

Here we use Ollama and Tauri with the Qwik frontend framework instead of Svelte.


## Get Started

### Prerequisites

1. Rust, see https://www.rust-lang.org/learn/get-started

       curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh

2. Node.js, see https://nodejs.org/en/download

3. pnpm (optional), see https://pnpm.io/installation

       npm install -g pnpm

   For the Qwik frontend (already done in this demo):

       pnpm add -D -w @tauri-apps/api # or `cd apps/ollama-tauri-qwik && npm i -D @tauri-apps/api`

4. Ollama running locally, or configure the connection in the config
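By default, Ollama listens on `http://localhost:11434` and exposes a REST API. As a hedged sketch of what talking to it from the frontend could look like (the constant, function names, and model name below are illustrative assumptions, not this repo's actual code), a non-streaming call to the `/api/generate` endpoint:

```typescript
// Illustrative helper for a locally running Ollama server.
// OLLAMA_URL and the function names are assumptions for this sketch;
// the endpoint path and payload shape follow Ollama's REST API.
const OLLAMA_URL = "http://localhost:11434";

interface RequestOptions {
  method: string;
  headers: Record<string, string>;
  body: string;
}

// Build the fetch options for a non-streaming /api/generate call.
function buildGenerateRequest(model: string, prompt: string): RequestOptions {
  return {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model, prompt, stream: false }),
  };
}

// Send the prompt and return the model's full response text.
async function generate(model: string, prompt: string): Promise<string> {
  const res = await fetch(
    `${OLLAMA_URL}/api/generate`,
    buildGenerateRequest(model, prompt),
  );
  if (!res.ok) throw new Error(`Ollama request failed: ${res.status}`);
  const data = await res.json();
  // With stream: false, the full text arrives in the `response` field.
  return data.response;
}
```

If Ollama runs on another host or port, only `OLLAMA_URL` needs to change.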

## Start app

To run in development mode:

    pnpm dev # or `cd apps/ollama-tauri-qwik && npm run tauri dev`

To package the desktop application for installation:

    pnpm build # or `cd apps/ollama-tauri-qwik && npm run tauri build`
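The wiring between these commands and the Qwik dev server lives in Tauri's configuration. A minimal sketch of the Tauri v2 `build` section of `src-tauri/tauri.conf.json` (the commands, port, and dist path shown here are assumptions, not necessarily this repo's actual values):

```json
{
  "build": {
    "beforeDevCommand": "pnpm dev",
    "devUrl": "http://localhost:5173",
    "beforeBuildCommand": "pnpm build",
    "frontendDist": "../dist"
  }
}
```

During `tauri dev`, Tauri loads the frontend from `devUrl`; during `tauri build`, it bundles the static files from `frontendDist` into the desktop app.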

## The pros of this tech stack

- Tauri is secure and fast because it is built on Rust, and it keeps the app size quite small by using the OS's native web renderer.

- Qwik is a revolution in web frontend technology ...

  - the fastest modern frontend framework I know
  - it stays fast: you can count on performance holding up even as business features grow
  - it is secure and highly scalable as a statically generated site (SSG), since Tauri only has to serve static files

- Ollama is an open-source, fast tool for running LLMs locally, e.g. Meta's Llama 3.2 or DeepSeek-R1

  - your prompts and data stay fully private, in case compliance rules forbid sharing them with other companies or countries (cloud providers)
  - it also keeps your ideas, thoughts, and private topics private; they will not be shared with anyone
  - it fits requirements for low latency, for large amounts of data that cannot be sent over the internet, or for unavailable or unreliable internet connections (edge computing)
  - it is also an option for lowering your AI costs by hosting your AI on premises
  - it works when you need to embed your own data (images, PDFs, ...) that cannot be shared with another company or country
  - it can also be an option if you are worried about potential censorship
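As a small illustration of working with a local model: when streaming is enabled, Ollama's `/api/generate` endpoint emits one JSON object per line, each carrying a `response` fragment, with the final chunk marked `done: true`. A hedged sketch of reassembling the full reply on the client (the function name and sample stream are assumptions for illustration):

```typescript
// Reassemble a streamed Ollama /api/generate reply.
// Each line of the stream is a JSON object with a `response` fragment;
// the final object is marked `done: true`.
function collectStreamedResponse(stream: string): string {
  let text = "";
  for (const line of stream.split("\n")) {
    if (!line.trim()) continue;     // skip blank lines between chunks
    const chunk = JSON.parse(line);
    text += chunk.response ?? "";   // append each partial fragment
    if (chunk.done) break;          // final chunk signals completion
  }
  return text;
}

// Example with a fabricated two-chunk stream:
const sample =
  '{"response":"Hello","done":false}\n' +
  '{"response":" world","done":true}\n';
console.log(collectStreamedResponse(sample)); // → "Hello world"
```

Streaming the fragments into the UI as they arrive is what keeps a local chat app feeling responsive even on slower hardware.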