Releases: campfirein/cipher

v0.3.0

28 Aug 04:16

Changelog

[0.3.0] - 2025-08-28

🚀 Features

  • Provided Full Support for SSE and Streamable-HTTP Transports and Refactored README #193
  • Optimized Payload and Introduced New Workspace Environment Variables #195
  • Added ChromaDB Backend. #197
  • Adjusted Default Values for Vector Stores and Updated Docs #225
  • Added Pinecone Backend. #202
  • Added Pgvector Backend. #205
  • Added FAISS Backend. #217
  • Added Redis Backend. #218
  • Added Weaviate Backend. #225

🐛 Bug Fixes

  • Fixed AWS LLM provider not recognized at startup. #212
  • Fixed Streamable-HTTP MCP transport + Tool Panel payloads. #214

📝 Documentation

  • Refactored README and Provided Full Documentation in docs/ #193

v0.2.2

07 Aug 18:27

Changelog

[0.2.2] - 2025-08-08

🚀 Features

  • Added Input Refinement Feature to Search Tools. #186

🐛 Bug Fixes

  • Fixed Tool Exposure on Aggregator mode. #182
  • Fixed MCP Endpoint Routing. #183
  • Fixed Redis Connection Timeout. #185

📝 Documentation

  • Added Workspace Memory Team Progress Tracking Use Case. #187

v0.2.1

06 Aug 03:24

Changelog

[0.2.1] - 2025-08-06

🚀 Features

  • Added LM Studio support for LLM and embedding fallback. #148
  • Added Optimizations: Tokenizer Caching, Connection Pooling, and Lazy Loading. #153
  • Added PostgreSQL support for session persistence. #155
  • Added Workspace Memory System for team collaboration and project tracking. #165
  • Added built-in bash command execution tool. #173
  • Added Fully Functional Web UI with session persistence. #178

🐛 Bug Fixes

  • Fixed fallback functionality in session/embedding management. #151
  • Fixed Ollama 404 Connection Errors with Incorrect URL Endpoints. #162
  • Fixed Claude Desktop integration Error. #164

📝 Documentation

  • Added Tutorial Video to README.md. #157
  • Added User Guide for Workspace Memory. #168

v0.2.0

30 Jul 15:12

📝 Changelog

[0.2.0] - 2025-07-30

✨ Added

  • 🧠 Multi-backend conversation support
    Added support for persistent memory across multiple LLM backends using a new architecture. Enables PostgreSQL-based WAL persistence.

  • 🔮 Google Gemini support
    Introduced support for Gemini LLMs and embedding models.

  • ☁️ Alibaba Cloud Qwen support
    Added compatibility for Qwen models.

  • 🧬 Gemini & Ollama embedding providers
    Cipher now supports embedding generation via Google Gemini and Ollama APIs.

  • 🛡️ Embedding fallback mechanism
    Implements automatic fallback logic for embeddings. If the primary provider is unavailable, Cipher selects the next available provider based on environment variables.

  • 📦 Prompt Provider Support
    Added extensible system prompt architecture with dynamic, static, and file-based providers. Enables customizable prompt injection through CLI commands (/prompt-providers) with support for conversation summaries, project guidelines, and real-time prompt management.

  • 📊 Token Management
    Implemented intelligent token counting and context compression with provider-specific tokenization. Features automatic compression when approaching context limits, token usage statistics, and configurable compression strategies for optimal memory management across different LLM providers.

  • 🧩 Aggregator Mode support
    Implements Aggregator mode for Cipher's MCP server, which exposes all of Cipher's tools to agents/clients.
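The embedding fallback described above can be sketched as a priority scan over environment variables. This is an illustrative sketch only, not Cipher's actual implementation: the provider order, the `GEMINI_API_KEY` variable, and the function names are assumptions (`OPENAI_API_KEY` and `OLLAMA_BASE_URL` do appear in Cipher's documented configuration).

```typescript
// Illustrative sketch only: pick the first embedding provider whose
// credentials are present in the environment, in priority order.
// Provider order and env-var mapping are assumptions, not Cipher's
// actual internals.
type Env = Record<string, string | undefined>;

const providerOrder: Array<[string, (env: Env) => boolean]> = [
  ["openai", env => Boolean(env.OPENAI_API_KEY)],
  ["gemini", env => Boolean(env.GEMINI_API_KEY)], // assumed variable name
  ["ollama", env => Boolean(env.OLLAMA_BASE_URL)],
];

// Returns the highest-priority configured provider, or null if none
// is configured at all.
function selectEmbeddingProvider(env: Env): string | null {
  for (const [name, configured] of providerOrder) {
    if (configured(env)) return name;
  }
  return null;
}
```

If the primary provider's key is absent, the scan simply falls through to the next configured entry, which matches the fallback behavior described in the bullet above.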
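The static / dynamic / file-based prompt providers mentioned above can be pictured as implementations of one small interface. All names below are hypothetical, chosen for illustration; they are not Cipher's actual API.

```typescript
import { readFileSync } from "node:fs";

// Hypothetical sketch of the prompt-provider idea; none of these
// names come from Cipher's codebase.
interface PromptProvider {
  getPrompt(): string;
}

// Static: a fixed string (e.g. project guidelines).
const staticProvider = (text: string): PromptProvider => ({
  getPrompt: () => text,
});

// Dynamic: computed at call time (e.g. a conversation summary).
const dynamicProvider = (compute: () => string): PromptProvider => ({
  getPrompt: compute,
});

// File-based: read from disk on each use.
const fileProvider = (path: string): PromptProvider => ({
  getPrompt: () => readFileSync(path, "utf8"),
});

// The final system prompt is assembled from every registered provider.
function buildSystemPrompt(providers: PromptProvider[]): string {
  return providers.map(p => p.getPrompt()).join("\n\n");
}
```

Keeping providers behind one interface is what makes the architecture extensible: adding a new prompt source is just another `PromptProvider` implementation.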
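The token-management idea above, compressing context as it approaches a limit, can be sketched as follows. This is a minimal illustration, not Cipher's implementation: `countTokens` is a crude whitespace count standing in for the provider-specific tokenizers, and dropping oldest messages is just one possible compression strategy.

```typescript
// Illustrative sketch of context compression under a token budget.
// All names and the strategy itself are assumptions for illustration.
interface Message {
  role: "user" | "assistant" | "system";
  content: string;
}

// Crude stand-in for a real provider-specific tokenizer.
function countTokens(text: string): number {
  return text.split(/\s+/).filter(Boolean).length;
}

// One possible strategy: drop the oldest messages until the running
// total fits the budget, always keeping the newest message.
function compressContext(messages: Message[], maxTokens: number): Message[] {
  const kept = [...messages];
  let total = kept.reduce((sum, m) => sum + countTokens(m.content), 0);
  while (kept.length > 1 && total > maxTokens) {
    total -= countTokens(kept.shift()!.content);
  }
  return kept;
}
```

A real implementation would also summarize dropped messages rather than discard them outright, but the trigger condition (total tokens exceeding a configurable limit) is the same.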


📚 Documentation

  • 🚀 New example: MCP aggregator hub (Use Case 4)
    Introduced a new real-world example in examples/usecase-4, showing how Cipher can aggregate multiple MCP streams.

  • 📝 Improved README and prompt documentation
    Updated descriptions, prompt templates, and environment variable explanations to reflect the latest architecture and provider support.


v0.1.1 - Initial Release

16 Jul 13:15

Cipher is a memory-powered AI agent framework designed specifically for coding
agents with MCP (Model Context Protocol) integration.

✨ Key Features

🧠 Dual Memory System

  • System 1: Programming Concepts & Business Logic & Past Interactions
  • System 2: Reasoning steps of the model when generating code
  • Auto-generated memories that scale with your codebase

🔌 IDE Integration

  • MCP Integration: Full Model Context Protocol support for tools and
    resources
  • Universal IDE Support: Works with Cursor, Windsurf, Claude Desktop, Claude
    Code, Gemini CLI, VS Code, and Roo Code
  • Zero Configuration: Install in your IDE with zero setup needed
  • Seamless Switching: Switch between IDEs without losing memory

🛠️ Multiple Operation Modes

  • Interactive CLI: cipher for interactive sessions
  • One-shot Commands: cipher "your prompt"
  • REST API Server: cipher --mode api
  • MCP Server: cipher --mode mcp

🤖 Multi-LLM Support

  • OpenAI: GPT-4 Turbo and other models
  • Anthropic: Claude 3.5 Sonnet and other models
  • OpenRouter: 200+ models available
  • Ollama: Self-hosted models (no API key needed)

📊 Knowledge Graph Integration

  • Neo4j Support: Structured memory with entity relationships
  • In-memory Option: Lightweight alternative for development
  • Real-time Learning: Memory layers that improve automatically

🔧 Session Management

  • Create, switch, and manage multiple conversation sessions
  • Persistent memory across sessions
  • Team workspace sharing in real-time

🚀 Installation

NPM Package (Recommended)

# Install globally
npm install -g @byterover/cipher

# Or install locally in your project
npm install @byterover/cipher

Docker

git clone https://github.com/campfirein/cipher.git
cd cipher
cp .env.example .env
docker-compose up -d

From Source

pnpm i && pnpm run build && npm link

📖 Usage Examples

CLI Interactive Mode

cipher

One-shot Command

cipher "Add this to memory as common causes of 'CORS error' in local dev with Vite + Express."

API Server Mode

cipher --mode api

MCP Server Mode

cipher --mode mcp
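To use this mode from an MCP-capable client, the server entry typically looks like the snippet below. This is an illustrative example of the common MCP client configuration shape; the exact file location and supported fields depend on your client (Claude Desktop, Cursor, etc.).

```json
{
  "mcpServers": {
    "cipher": {
      "command": "cipher",
      "args": ["--mode", "mcp"],
      "env": {
        "OPENAI_API_KEY": "your_openai_api_key"
      }
    }
  }
}
```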

🔧 Configuration

Configure with environment variables and YAML:

Environment Variables

OPENAI_API_KEY=your_openai_api_key
ANTHROPIC_API_KEY=your_anthropic_api_key
OPENROUTER_API_KEY=your_openrouter_api_key
OLLAMA_BASE_URL=http://localhost:11434/v1

Agent Configuration (memAgent/cipher.yml)

llm:
  provider: openai
  model: gpt-4-turbo
  apiKey: $OPENAI_API_KEY

systemPrompt: 'You are a helpful AI assistant with memory capabilities.'

mcpServers:
  filesystem:
    type: stdio
    command: npx
    args: ['-y', '@modelcontextprotocol/server-filesystem', '.']

🎯 Use Cases

- Claude Code Integration: Seamless MCP integration with persistent memory
- Coding Agent Memory: Enhanced memory for coding agents like Kimi K2
- Team Collaboration: Shared memory workspace across teams
- Multi-IDE Development: Consistent memory across different development
environments

🔗 Links

- Documentation: https://docs.byterover.dev/cipher/overview
- Discord Community: https://discord.com/invite/UMRrpNjh5W
- Examples: Check the examples/ directory for detailed usage examples

📝 License

Elastic License 2.0

---
🌟 If you enjoy Cipher, please give us a star on GitHub—it helps a lot!