# cognee-integration-langgraph

A powerful integration between Cognee and LangGraph that provides intelligent knowledge management and retrieval capabilities for AI agents.

`cognee-integration-langgraph` combines Cognee's advanced knowledge storage and retrieval system with LangGraph's workflow orchestration capabilities. This integration allows you to build AI agents that can efficiently store, search, and retrieve information from a persistent knowledge base.
## Features

- **Smart Knowledge Storage**: Add and persist information using Cognee's advanced indexing
- **Semantic Search**: Retrieve relevant information using natural language queries
- **Session Management**: Support for user-specific data isolation
- **LangGraph Integration**: Seamless integration with LangGraph's agent framework
- **Async Support**: Built with async/await for high-performance applications
## Installation

```bash
# Using uv
uv add cognee-integration-langgraph

# Using pip
pip install cognee-integration-langgraph
```
## API Reference

### `get_sessionized_cognee_tools(session_id)`

Returns sessionized Cognee tools for isolated data management.

**Returns:** a tuple `(add_tool, search_tool)` of tools for storing and searching data:

- `add_tool`: Store information in the knowledge base
- `search_tool`: Search and retrieve previously stored information
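A minimal sketch of the call pattern above. Since the real tools need a configured Cognee backend and API key, a hypothetical in-memory stub (`_stub_sessionized_tools`, not part of the library) stands in for `get_sessionized_cognee_tools` here:

```python
# Illustrative sketch only: the in-memory stub below stands in for the real
# Cognee-backed tools, which require a configured backend and an API key.
from typing import Callable, Tuple


def _stub_sessionized_tools(session_id: str) -> Tuple[Callable, Callable]:
    """Hypothetical stand-in for get_sessionized_cognee_tools."""
    store: list[str] = []  # per-session storage

    def add_tool(text: str) -> str:
        store.append(text)
        return f"stored in session {session_id}"

    def search_tool(query: str) -> list[str]:
        # Naive substring match in place of Cognee's semantic search
        return [t for t in store if query.lower() in t.lower()]

    return add_tool, search_tool


# Unpack the tuple exactly as with the real function
add_tool, search_tool = _stub_sessionized_tools("demo-session")
add_tool("The contract with Acme Corp expires in June 2026.")
print(search_tool("acme"))  # → ["The contract with Acme Corp expires in June 2026."]
```

With the real library, the two returned tools are used the same way, but storage and search are backed by Cognee's persistent knowledge base.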
## Session Management

`cognee-integration-langgraph` supports user-specific sessions to isolate data between different users or contexts:

```python
from cognee_integration_langgraph import get_sessionized_cognee_tools

user1_tools = get_sessionized_cognee_tools("user-123")
user2_tools = get_sessionized_cognee_tools("user-456")
```
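The isolation property can be illustrated with a small conceptual sketch. The dict-keyed store below is a hypothetical stand-in; the real tools persist each session's data in Cognee's knowledge store:

```python
# Conceptual sketch of session isolation: each session id maps to its own
# store, so one user's data never appears in another user's search results.
# (Stand-in only; the real library persists to Cognee, not a dict.)
_stores: dict[str, list[str]] = {}


def make_tools(session_id: str):
    store = _stores.setdefault(session_id, [])

    def add_tool(text: str) -> str:
        store.append(text)
        return "ok"

    def search_tool(query: str) -> list[str]:
        return [t for t in store if query.lower() in t.lower()]

    return add_tool, search_tool


u1_add, u1_search = make_tools("user-123")
u2_add, u2_search = make_tools("user-456")

u1_add("User 123 prefers weekly reports.")
print(u1_search("reports"))  # user-123 sees its own data
print(u2_search("reports"))  # → [] — user-456's session stays empty
```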
## Environment Setup

Copy the `.env.template` file to `.env` and fill out the required API keys:

```bash
cp .env.template .env
```

Then edit the `.env` file and set both keys using your OpenAI API key:

```
OPENAI_API_KEY=your-openai-api-key-here
LLM_API_KEY=your-openai-api-key-here
```
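For reference, a `.env` file is just `KEY=VALUE` lines. The stdlib sketch below shows how such a file could be loaded into the process environment (libraries like python-dotenv do this more robustly; the file path here is a temporary file created for the demo):

```python
# Minimal stdlib sketch: parse KEY=VALUE lines from a .env-style file into
# os.environ. Real projects typically use python-dotenv instead.
import os
import tempfile

env_text = (
    "OPENAI_API_KEY=your-openai-api-key-here\n"
    "LLM_API_KEY=your-openai-api-key-here\n"
)

# Write a demo .env file to a temporary location
with tempfile.NamedTemporaryFile("w", suffix=".env", delete=False) as f:
    f.write(env_text)
    path = f.name

with open(path) as f:
    for line in f:
        line = line.strip()
        if line and not line.startswith("#") and "=" in line:
            key, _, value = line.partition("=")
            os.environ[key.strip()] = value.strip()

print(os.environ["LLM_API_KEY"])  # → your-openai-api-key-here
```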
## Examples

Check out the `examples/` directory for more comprehensive usage examples:

- `examples/example.py`: Complete workflow with contract management
- `examples/guide.ipynb`: Jupyter notebook tutorial with step-by-step guidance
## Requirements

- Python 3.12+
- OpenAI API key
- Dependencies automatically managed via `pyproject.toml`