Build intelligent AI applications with persistent semantic memory
Get Started at kinic.io • API Docs • Live Demos
Kinic is a revolutionary semantic memory platform that enables AI agents to store, retrieve, and share knowledge across sessions and applications. Think of it as a brain for your AI systems - a centralized memory layer that makes your agents truly intelligent and context-aware.
Sign up at www.kinic.io to get your API key and start building with Kinic today. Your API key unlocks:
- ✅ Unlimited memory storage for your AI agents
- ✅ Cross-session persistence - agents remember everything
- ✅ Multi-agent orchestration - agents share knowledge seamlessly
- ✅ Real-time sync across all your applications
- ✅ Enterprise-grade security for your data
Traditional AI agents are stateless - they forget everything between sessions. Building context-aware, intelligent applications requires complex infrastructure for memory management, retrieval, and synchronization.
Kinic provides a simple API that gives your AI agents perfect memory. Store anything, retrieve everything, and build applications that truly understand and remember your users.
```python
# Store knowledge
kinic.store({
    "user_preference": "dark_mode",
    "context": "ui_settings",
    "metadata": {"timestamp": "2024-01-15"}
})

# Retrieve with natural language
memories = kinic.query("what UI preferences does the user have?")
```

Connect multiple AI agents (OpenAI, Anthropic, Perplexity) to a shared knowledge base. Agents can:
- Share discoveries and insights
- Collaborate on complex tasks
- Maintain context across conversations
- Build on each other's knowledge
This repository includes a stunning visualization of your Kinic brain and connected agents. Watch knowledge flow in real-time as your agents interact with the memory layer.
Watch how two AI agents collaborate using Kinic's memory layer to discover and share HuggingFace models:
In this simple yet powerful demonstration:
- Agent 1 discovers and saves HuggingFace model information to Kinic
- Agent 2 automatically finds and uses that knowledge through semantic search
- No direct communication needed - pure memory-based collaboration
- Real-world example of how Kinic enables AI teamwork
| Demo | Description | Try It |
|---|---|---|
| Brain Visualization | Watch your AI agents orbit around the Kinic memory core with real-time data flow | Launch → |
| Google A2A Protocol | See enterprise-grade agent-to-agent communication with memory persistence | Launch → |
| Agent Discovery | Watch agents automatically discover and connect with each other | Launch → |
Not just a visualization - a fully functional mission control for your AI agents:
- Configure Tab: Connect and authenticate AI agents (OpenAI, Anthropic, Perplexity)
- Command Tab: Send real-time commands to your agent network
- Tasks Tab: Set up automated workflows and background jobs
- Live Metrics: Monitor agent activity, connections, and memory usage
- Try it Live →
Working Flask API server with real AI platform integrations:

```
# Just 4 lightweight dependencies!
flask==3.0.0
flask-cors==4.0.0
openai==1.12.0
anthropic==0.18.0
```

- Actual API connections (not mocked)
- Test endpoints for verification
- CORS-enabled for web apps
- Ready to deploy today
- Google A2A Protocol Support: Works with industry-standard agent communication
- See A2A Demo →
- Multi-Platform: OpenAI, Anthropic, Perplexity, and more
- Secure: API keys stored locally, never transmitted unnecessarily
Complete testing and debugging infrastructure:
- test-agent.html: Standalone API connection tester
- Health check endpoints: Monitor system status
- Real-time error reporting: Clear feedback when things go wrong
- Coordinate calibration tools: For UI automation testing
Use your Kinic Chrome extension from local apps via Chrome Native Messaging.
- Plain-English background: This adds a tiny Python “bridge” that Chrome launches on demand. Your extension keeps a persistent connection to it, and the bridge exposes a simple local HTTP API. Any desktop app can now ask the extension to store the active tab (or a URL) and retrieve results—without fragile UI automation.
- How it works at a glance:
  - The extension opens a native port to `com.kinic.api` and listens for `{ id, action, params }`.
  - The native host exposes `http://127.0.0.1:5007/api/kinic/*`, forwards requests to the extension over the port, and returns `{ success, message, data }`.
  - Keep messages small; the extension does the heavy lifting (page capture, persistence, search).
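On the wire, Chrome native messaging frames each JSON message with a 4-byte message-length prefix. Here is a minimal sketch of the read/write loop a bridge like this relies on; the helper names are illustrative, not the actual `native-host/` code.

```python
import json
import struct
import sys

def read_message(stream=sys.stdin.buffer):
    """Read one native-messaging frame: a 4-byte length prefix, then JSON."""
    raw_len = stream.read(4)
    if len(raw_len) < 4:
        return None  # Chrome closed the pipe
    (length,) = struct.unpack("<I", raw_len)
    return json.loads(stream.read(length).decode("utf-8"))

def write_message(msg, stream=sys.stdout.buffer):
    """Write one frame back to the extension."""
    data = json.dumps(msg).encode("utf-8")
    stream.write(struct.pack("<I", len(data)))
    stream.write(data)
    stream.flush()
```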
- Location: `native-host/`
- Install (macOS): `cd native-host && python3 -m pip install -r requirements.txt && ./install_macos.sh`
- Install (Windows): `cd native-host && pip install -r requirements.txt && powershell -ExecutionPolicy Bypass -File .\install_windows.ps1`
- Extension setup: Add `"permissions": ["nativeMessaging"]` and a service worker that keeps a persistent connection. See examples in `native-host/examples/`.
- HTTP API (local):
  - `POST /api/kinic/store` with `{ url?, title?, tags?, notes?, content?, selection?, metadata? }`
  - `POST /api/kinic/retrieve` with `{ query, top_k?, filters? }`
- Smoke test: `./native-host/smoke_test.sh`
Step-by-step example:
1. Load the example extension: `chrome://extensions` → Developer Mode → Load unpacked → select `native-host/example-extension` (or unzip `example-extension.zip`)
2. Copy the extension ID (32 characters)
3. Install the native host manifest with your extension ID
   - macOS: `cd native-host && DEV_ID=<your_id> PROD_ID=<your_id> ./install_macos.sh`
   - Windows (PowerShell): `cd native-host && powershell -ExecutionPolicy Bypass -File .\install_windows.ps1 -DevId <your_id> -ProdId <your_id>`
4. Open a web page in Chrome (e.g., https://kinic.io) so there's an active tab
5. Save the active tab: `curl -s -X POST http://127.0.0.1:5007/api/kinic/store -H 'Content-Type: application/json' -d '{}'`
   Expect `{ "success": true, "message": "stored", "data": { ... } }`
6. Retrieve results: `curl -s -X POST http://127.0.0.1:5007/api/kinic/retrieve -H 'Content-Type: application/json' -d '{"query":"test"}'`
   Expect `{ "success": true, "data": { "items": [...] } }`
7. Replace the stubs in the example service worker with your real Kinic save/retrieve code.
Next steps:
- Load the example extension from `native-host/example-extension` (or your own) via `chrome://extensions`.
- Note the extension ID and re-run the installer with the IDs to update the host manifests.
- Verify the host at `GET http://127.0.0.1:5007/api/status` and trigger a test store/retrieve.
- Replace the example service worker stubs with your real Kinic save/retrieve logic.
Here's how easy it is for agents to collaborate through Kinic:

```python
# Agent 1: Researcher finds a great model
researcher = KinicAgent(api_key="your-kinic-key", role="researcher")
researcher.save("Found amazing sentiment model: cardiffnlp/twitter-roberta-base-sentiment")

# Agent 2: Builder (different session, different AI) needs a sentiment model
builder = KinicAgent(api_key="your-kinic-key", role="builder")
models = builder.search("sentiment analysis models")
# Returns: "cardiffnlp/twitter-roberta-base-sentiment" - exactly what Agent 1 found!

# Builder uses the discovered model
builder.implement(f"Create API using {models[0]}")
```

That's it! No complex orchestration, no direct communication protocols. Just shared memory.
```bash
# Entire setup in 30 seconds
git clone https://github.com/hshadab/kinic-api.git
cd kinic-api
pip install -r requirements.txt  # Just 4 packages!
python kinic-agent-api.py        # You're live!
```

- Any AI Model: OpenAI, Anthropic, Perplexity, HuggingFace, local models
- Any Language: Python client provided, REST API works with anything
- Any Platform: Web apps, CLI tools, notebooks, automation scripts
- Any Scale: From prototypes to production systems
- Visual dashboard shows agent connections in real-time
- Test connections with actual API calls
- Monitor memory growth and agent collaboration
- Debug with clear error messages and health checks
Create your free account at www.kinic.io and grab your API key from the dashboard.
```bash
git clone https://github.com/hshadab/kinic-api.git
cd kinic-api

# Create virtual environment
python3 -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate

# Install dependencies
pip install -r requirements.txt
```

```python
# In your code
import os
os.environ['KINIC_API_KEY'] = 'your-api-key-from-kinic.io'
```

```bash
# Start the agent backend
python kinic-agent-api.py

# Open the visualization in your browser
open index.html
```

All API requests require your Kinic API key from www.kinic.io.
```python
# Initialize client
from kinic import KinicClient
client = KinicClient(api_key="your-key-from-kinic.io")

# Store memory
client.store_memory(
    content="User prefers Python for data science tasks",
    category="user_preferences",
    tags=["python", "data_science", "preferences"]
)

# Query memories
results = client.query_memories(
    query="What programming languages does the user know?",
    limit=10
)

# Delete specific memory
client.delete_memory(memory_id="mem_123")
```

Connect your favorite AI models to Kinic:
```python
# Connect OpenAI with Kinic memory
from kinic import KinicAgent

agent = KinicAgent(
    kinic_api_key="your-kinic-key",
    openai_api_key="your-openai-key",
    model="gpt-4"
)

# Agent automatically uses Kinic for memory
response = agent.chat("Remember that I prefer Python for ML tasks")
# This preference is now stored in Kinic

# Later, even in a new session
response = agent.chat("What's my preferred language for ML?")
# Returns: "You prefer Python for ML tasks"
```

The included brain visualization showcases:
- 3D Neural Network - Your Kinic brain pulsing with activity
- Agent Constellation - AI agents orbiting your knowledge core
- Data Flow Particles - Watch memories being stored and retrieved
- Real-time Metrics - Monitor connections, storage, and activity
- Configuration Panel - Set up and test agent connections
- Command Interface - Direct control over your AI ecosystem
```python
# Create specialized agents sharing one Kinic brain
researcher = KinicAgent(role="researcher", model="gpt-4")
analyst = KinicAgent(role="analyst", model="claude-3")
builder = KinicAgent(role="builder", model="gpt-4")

# Researcher discovers information
researcher.process("Find the latest ML optimization techniques")

# Analyst can access researcher's findings
analyst.process("Analyze the techniques found and rank by efficiency")

# Builder uses both agents' knowledge
builder.process("Implement the top-ranked optimization technique")
```

```python
# Store rich, contextual information
content = "Customer prefers email communication on Tuesdays"
client.store_memory({
    "content": content,
    "embedding": generate_embedding(content),
    "metadata": {
        "customer_id": "cust_123",
        "preference_type": "communication",
        "confidence": 0.95
    }
})

# Semantic similarity search
similar_memories = client.search_similar(
    query_embedding=generate_embedding("communication preferences"),
    threshold=0.8
)
```

Build an AI that truly knows you: your preferences, history, and context persist forever.
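Under the hood, a threshold-based similarity search boils down to comparing embedding vectors by cosine similarity and keeping the matches that score above the cutoff. A minimal illustration of the idea (not Kinic's actual implementation; `search_similar` here is a local sketch):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def search_similar(query_embedding, memories, threshold=0.8):
    """Return (score, memory) pairs above the threshold, best match first."""
    scored = [(cosine(query_embedding, m["embedding"]), m) for m in memories]
    return sorted((s for s in scored if s[0] >= threshold), key=lambda s: -s[0])
```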
Agents that remember every interaction, preference, and issue across all channels.
Multiple AI agents collaborating on complex problems, building on shared discoveries.
AI systems that maintain consistency in tone, style, and narrative across projects.
Tutoring systems that adapt to each student's learning pattern and progress.
- Documentation: Full API docs available after login at www.kinic.io
- Discord: Join our community for support and updates
- GitHub Issues: Report bugs or request features
- Enterprise: Contact us for dedicated support and custom solutions
- Sign up at www.kinic.io for your free account
- Get your API key from the dashboard
- Clone this repository
- Run the visualization demo
- Connect your first AI agent
- Store your first memory
- Build something amazing!
This repository is provided for use with the Kinic platform. See LICENSE for details.
Ready to give your AI perfect memory?
Built with ❤️ by the Kinic team
