A self-hosted bookmark management application that helps you organize, search, and discover your saved links. Built with Rust, featuring AI-powered summarization and tagging to make your bookmarks more useful and discoverable.
- Offline-First: Store and manage bookmarks entirely on your own infrastructure
- AI-Powered Organization: Automatic tagging and summarization using Ollama integration
- RAG-Enhanced Search: Intelligent search using Retrieval-Augmented Generation to find relevant bookmarks based on semantic similarity
- Full-Text Search: Search through bookmark titles, URLs, content, and AI-generated summaries
- Tag Management: Organize bookmarks with manual and AI-suggested tags
- Content Extraction: Automatically extract and store readable content from web pages
- Modern Web Interface: Responsive WebAssembly-based frontend built with Yew
- REST API: Complete API for programmatic access and integrations
- CLI Tools: Command-line interface for batch operations and automation
The easiest way to get started is using Docker Compose:
# Build the application container
$ just build-container
# Start the full stack (database + application)
$ docker compose up

This will start:
- PostgreSQL database on port 5432
- Browserless Chrome automation service on port 3001
- Bookmark Hub server on port 3000
- Web interface accessible at http://localhost:3000
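For orientation, a compose file along these lines would wire the three services together. This is an illustrative sketch only — the repository's own docker-compose.yml is authoritative, and the image names here (`postgres:17`, `browserless/chrome`, `bookmark-hub`) are assumptions:

```yaml
services:
  db:
    image: postgres:17            # assumed tag; see the repo's compose file
    ports:
      - "5432:5432"
  chrome:
    image: browserless/chrome     # headless Chrome automation service
    ports:
      - "3001:3000"               # browserless listens on 3000 internally
  server:
    image: bookmark-hub           # hypothetical name for the container built above
    ports:
      - "3000:3000"
    depends_on:
      - db
      - chrome
```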
If you already have Ollama running on your host machine, use the alternative compose file:
# Build the application container
$ just build-container
# Start with host Ollama integration
$ docker compose -f docker-compose.host-ollama.yml up

This configuration:
- Uses `network_mode: host` for direct access to host services
- Connects to Ollama running at `localhost:11434`
- Uses `pgvector/pgvector:pg17` for vector embedding storage
- Configures the embedding model as `nomic-embed-text:v1.5` for RAG features
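The host-networking variant can be pictured like this — a sketch assembled from the settings listed above (the exact keys come from the repository's docker-compose.host-ollama.yml; the `bookmark-hub` image name is hypothetical):

```yaml
services:
  db:
    image: pgvector/pgvector:pg17   # Postgres with the vector extension
    network_mode: host
  server:
    image: bookmark-hub             # hypothetical image name
    network_mode: host              # lets the container reach host services directly
    environment:
      OLLAMA_URL: http://localhost:11434
      OLLAMA_EMBEDDING_MODEL: nomic-embed-text:v1.5
```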
For local development with hot reloading:
- Rust toolchain with the `wasm32-unknown-unknown` target
- Just task runner
- Trunk for WebAssembly builds
- PostgreSQL database
- Optional: Ollama for AI features
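A quick way to confirm the tooling is in place before starting — a generic preflight sketch, using the tool names from the list above:

```shell
# Check that each required dev tool is on PATH
for tool in just trunk psql cargo; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "ok: $tool"
  else
    echo "missing: $tool"
  fi
done
```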
- Start PostgreSQL database (locally or via Docker)
- Start the server:
$ just run-server
This starts the API server with development configuration.
- Start the frontend (in a separate terminal):
$ just run-spa
This starts the development server at http://localhost:8080 with hot reloading.
The server accepts configuration via environment variables:
# Database
export PG_HOST=localhost
export PG_PORT=5432
export PG_USER=your_user
export PG_PASSWORD=your_password
export PG_DATABASE=bookmark_hub
# Application
export HMAC_KEY=your_secret_key
export APP_DATA_DIR=/path/to/data
# Optional: AI Features
export OLLAMA_URL=http://localhost:11434
export OLLAMA_TEXT_MODEL=gemma3:4b
export OLLAMA_EMBEDDING_MODEL=nomic-embed-text:v1.5
# Optional: Chrome Automation
export CHROME_HOST=localhost
export CHROME_PORT=3001

# Login (required first)
$ just run-cli login --url http://localhost:3000 --username your_user --password your_password
# Add a single bookmark
$ just run-cli add --url https://example.com
# Add multiple bookmarks from a file (one URL per line)
$ just run-cli add-batch --file urls.txt

Run end-to-end tests using Hurl (requires a running application):
$ hurl --verbose --test test.hurl

- Server: Axum-based REST API with background processing daemons
- Frontend: Yew WebAssembly application for modern web experience
- Database: PostgreSQL for reliable data storage with vector embeddings
- AI Integration: Ollama for content summarization, tag generation, and embedding creation
- RAG System: Vector similarity search using embeddings for intelligent bookmark discovery
- Content Processing: dom_smoothie for web page content extraction
- Browser Automation: Browserless Chrome for reliable web page rendering and content extraction
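To make the RAG piece concrete: retrieval boils down to ranking stored embedding vectors by cosine similarity against the query's embedding. A toy sketch of the metric itself, on small hand-made vectors (assumption: the real system computes this inside PostgreSQL via pgvector rather than in shell):

```shell
# Cosine similarity of two comma-separated vectors, via awk
cosine() {
  awk -v a="$1" -v b="$2" 'BEGIN {
    n = split(a, x, ","); split(b, y, ",")
    for (i = 1; i <= n; i++) { dot += x[i]*y[i]; na += x[i]^2; nb += y[i]^2 }
    printf "%.4f\n", dot / (sqrt(na) * sqrt(nb))
  }'
}
cosine "1,0,0" "1,0,0"   # identical direction -> 1.0000
cosine "1,0,0" "0,1,0"   # orthogonal vectors  -> 0.0000
```

In PostgreSQL, pgvector exposes the equivalent ranking through its cosine-distance operator (`<=>`), which is how similarity search over stored bookmark embeddings is typically done.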