The Vector Database for Agents & Humans. Configuration-driven, backend-agnostic, and built for the future.
vecdb is a dual-interface vector database system:
- MCP Server: Connects to AI agents (Claude, IDEs, etc.) via the Model Context Protocol.
- CLI Tool: Gives humans and scripts direct power over their vector indices.
- Vecq: A specialized CLI for structural code querying (jq for code).
Uses Qdrant as the robust storage backend.
vecq is now available as a standalone tool! Read the Guide.
```sh
install.sh
vecdb ingest ./
docsize "How do I install and use vecq?"
```

**Option A: Install via Cargo (Recommended)**

```sh
cargo install --git https://github.com/daryltucker/vecdb vecdb-cli vecdb-server vecq docsize
```

**Option B: Build from Source**
```text
$ ./install.sh
=== Installing vecdb binaries ===
Target: ~/.cargo/bin
[1/4] Installing vecq (jq for source code)...
[2/4] Installing vecdb (CLI)...
[3/4] Installing vecdb-server (MCP)...
[4/4] Installing docsize (LLM context tool)...

=== Installation Complete ===
Installed:
  - vecq (jq for source code)
  - vecdb (CLI tool)
  - vecdb-server (MCP server)
  - docsize (LLM context tool)

Verify with: vecq --help && vecdb --help && docsize --help

=== Autocomplete Setup ===
Detected bash. To enable autocomplete, add this to your /home/daryl/.bashrc:

# vecdb completions
[ -f "/home/daryl/.local/share/vecdb/completions/vecdb" ] && . "/home/daryl/.local/share/vecdb/completions/vecdb"
[ -f "/home/daryl/.local/share/vecdb/completions/vecq" ] && . "/home/daryl/.local/share/vecdb/completions/vecq"

Would you like me to add this to your /home/daryl/.bashrc now? (y/N) y
Added to /home/daryl/.bashrc. Please restart your shell or run:
source /home/daryl/.local/share/vecdb/completions/vecdb && source /home/daryl/.local/share/vecdb/completions/vecq
```
Tip: Use `./install.sh --verbose` to see compilation output.

See docs/BUILDING.md.
Run the initialization command to set up your configuration:
```sh
vecdb init
# Creates ~/.config/vecdb/config.toml
```

You need a running Qdrant instance.
**Option A: Using Docker (Recommended)**

Use a named Docker volume for persistence:

```sh
docker run -d -p 6333:6333 \
  -v vecdb_qdrant_data:/qdrant/storage \
  qdrant/qdrant
```

See the Examples README.md and docker-compose.qdrant.
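If you prefer Compose, a minimal equivalent might look like this (the file layout below is an illustrative sketch; the repository's docker-compose.qdrant example is the authoritative version):

```yaml
# docker-compose.yml -- minimal Qdrant service (illustrative)
services:
  qdrant:
    image: qdrant/qdrant
    ports:
      - "6333:6333"                          # REST API port
    volumes:
      - vecdb_qdrant_data:/qdrant/storage    # named volume for persistence

volumes:
  vecdb_qdrant_data:
```

Start it with `docker compose up -d`; the named volume survives container recreation, so your collections persist across upgrades.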
**Option B: Manual / Cloud**

Install Qdrant yourself or sign up at qdrant.tech, then update your config by editing it manually:
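The Qdrant section of `config.toml` might look roughly like the sketch below (the `[qdrant]` table name, the `url` key, and the placement of `local_use_gpu` are assumptions for illustration, not the confirmed schema; the file generated by `vecdb init` is authoritative):

```toml
# ~/.config/vecdb/config.toml -- illustrative sketch, not the confirmed schema
[qdrant]
url = "http://localhost:6333"   # hypothetical key: point at your Qdrant instance

# GPU embedding toggle mentioned in the GPU section (default: true)
local_use_gpu = true
```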
```sh
vim ~/.config/vecdb/config.toml
```

Ingest your documents:
```sh
# Ingest a directory with concurrency control
vecdb ingest ./docs --collection my_knowledge -P 4 -G 2

# Note: Ingestion is OOM-protected.
# -P, --concurrency:     Max parallel file-processing tasks.
# -G, --gpu-concurrency: Max GPU embedding batch size (prevents VRAM spikes).
```

By default, vecdb is built with CUDA support enabled (via `ort` static linking).
Prerequisites:

- NVIDIA drivers (v550+ recommended)
- NVIDIA CUDA Toolkit (`sudo apt install nvidia-cuda-toolkit`)
- NVIDIA cuDNN (`sudo apt install nvidia-cudnn`), required for runtime execution.

Configuration:

- Set `local_use_gpu = true` in `~/.config/vecdb/config.toml` (default).
- No manual library paths needed: the ONNX Runtime is statically linked into the binary.

Tip: A GPU is not required; you still benefit from vecdb when using CPU embeddings. This feature is here for those who want or need it.
If you do not need GPU support or want to reduce binary size, you can disable the default CUDA features during build:
```sh
cargo install --path vecdb-cli --no-default-features
```

Note: vecdb uses `ort` with static linking. You do not need to set `LD_LIBRARY_PATH` or manually manage `libonnxruntime.so` files.

Note: You will still need the `libonnxruntime_providers` libraries; ref: GPU.md.
vecdb supports two ways to exclude files:

- `.vectorignore` (always respected):
  - Works exactly like `.gitignore`.
  - Place it in your project root or subdirectories.
  - Example entries: `vecdb-asm/` or `*.secret`.
- `.gitignore` (optional):
  - Use `--respect-gitignore` to also respect your git rules.
  - Disabled by default to allow ingesting code you might not commit (e.g., local docs).
Tip: See docs/CONFIG.md for advanced ignore rules.
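For instance, since `.vectorignore` works like `.gitignore`, a file in your project root might look like this (the patterns below are just examples):

```gitignore
# .vectorignore -- example patterns (gitignore syntax)
target/          # build artifacts
node_modules/    # vendored dependencies
*.secret         # anything matching *.secret
!docs/keep.md    # re-include a previously ignored file
```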
Search:
```sh
# Standard semantic search
vecdb search "How do I configure profiles?" --collection my_knowledge

# Smart routing (multi-hop / filter detection)
vecdb search "latest rust files" --smart

# Pipe-friendly JSON output
vecdb search "auth policy" --json | jq .
```

Tip: `vecdb search` returns raw embedding matches. Use `docsize` for a more polished search that shows what these embeddings can do for your agent (even 1B or 4B models).
Check Status:
```sh
vecdb list
vecdb status
```

Quantization Management:

```sh
# Set Int8 quantization for a collection (persistent config)
vecdb config set-quantization my_coll scalar

# Apply optimization explicitly
vecdb optimize my_coll

# Check warnings for memory usage
vecdb list
```

More Examples: See docs/EXAMPLES.md and docs/CLI.md.
To use with an MCP client (like Claude Desktop or an IDE):
- Command: `vecdb-server`
- Arguments: `--allow-local-fs` (optional; enables the `ingest_path` tool)
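For Claude Desktop, the registration in `claude_desktop_config.json` might look like the sketch below (the server name `vecdb` is arbitrary, and the bare `vecdb-server` command assumes `~/.cargo/bin` is on the client's PATH; use an absolute path otherwise):

```json
{
  "mcpServers": {
    "vecdb": {
      "command": "vecdb-server",
      "args": ["--allow-local-fs"]
    }
  }
}
```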
Available Tools:
- `search_vectors`: Semantic search.
- `embed`: Generate embeddings.
- `ingest_path`: Ingest local files/folders.
- `ingest_history`: Time-travel ingestion (Git).
- `code_query`: Analyze code structure with `vecq`.
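Under the hood, an MCP client invokes these tools with a JSON-RPC `tools/call` request. A sketch for `search_vectors` (the `query` and `collection` argument names are assumptions for illustration, not the actual tool schema):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "search_vectors",
    "arguments": {
      "query": "How do I configure profiles?",
      "collection": "my_knowledge"
    }
  }
}
```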
See docs/MCP_SERVER.md for API details.
- EXAMPLES.md: Common usage patterns and tricks.
- CONFIG.md: Full configuration reference.
- BUILDING.md: Compile from source.
- vecq Guide: Manual for the `vecq` code query tool.
- Specs: Detailed feature modules in `docs/specs/` (e.g. Ingestion Design).
- Bug Reports: Please file an issue on GitHub.
- License: Business Source License 1.1 (Free for <$1M Revenue). See LICENSE.
"Configuration drives. Abstraction enables. Philosophy guides. Code follows."