Universal memory layer for AI Agents
Open-Source Memory Engine for LLMs, AI Agents & Multi-Agent Systems
Build memory-native AI agents with Memory OS — an open-source framework for long-term memory, retrieval, and adaptive learning in large language models.
Prevent PyTorch's `CUDA error: out of memory` in just 1 line of code.
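For context, the manual workaround this kind of tool automates usually looks like the following generic PyTorch pattern — an illustrative sketch of the underlying problem, not this project's one-line API:

```python
import torch

def forward_with_oom_retry(model, batch, min_batch=1):
    # Generic fallback: halve the batch on CUDA OOM until the forward pass fits.
    # Real code would then process the dropped samples in further chunks.
    while True:
        try:
            return model(batch)
        except torch.cuda.OutOfMemoryError:
            if batch.shape[0] <= min_batch:
                raise  # cannot shrink further; genuine capacity problem
            torch.cuda.empty_cache()  # return cached blocks to the allocator
            batch = batch[: batch.shape[0] // 2]
```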
Bringing the hell of pointers to Python.
Redis memory profiler that finds RAM bottlenecks by scanning the key space in real time and aggregating RAM usage statistics by key patterns.
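A minimal sketch of that technique using redis-py — SCAN to walk the key space incrementally, MEMORY USAGE per key, then aggregation under a simple pattern rule (the digit-collapsing rule here is a hypothetical example, not this tool's actual logic):

```python
import re
from collections import defaultdict

import redis  # pip install redis

r = redis.Redis()
usage_by_pattern = defaultdict(int)

# SCAN iterates the key space incrementally without blocking the server;
# MEMORY USAGE reports the approximate bytes a key occupies.
for key in r.scan_iter(count=1000):
    size = r.memory_usage(key) or 0
    pattern = re.sub(rb"\d+", b"*", key)  # collapse IDs: b'user:42' -> b'user:*'
    usage_by_pattern[pattern] += size

for pattern, total in sorted(usage_by_pattern.items(), key=lambda kv: -kv[1])[:10]:
    print(pattern.decode(), f"{total / 1024:.1f} KiB")
```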
An innovative agent framework driven by a knowledge-graph (KG) engine.
Universal memory layer for AI Agents. It provides scalable, extensible, and interoperable memory storage and retrieval to streamline AI agent state management for next-generation autonomous systems.
An AI memory layer with short- and long-term storage, semantic clustering, and optional memory decay for context-aware applications.
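As a rough illustration of what optional memory decay means in such a layer — names and the exponential half-life scheme here are hypothetical, not this project's implementation:

```python
import math
import time

class MemoryItem:
    """Illustrative memory record with exponential decay (hypothetical design)."""

    def __init__(self, text: str, importance: float = 1.0):
        self.text = text
        self.importance = importance
        self.created = time.time()

    def score(self, half_life_s: float = 3600.0) -> float:
        # Relevance halves every `half_life_s` seconds unless reinforced.
        age = time.time() - self.created
        return self.importance * math.exp(-math.log(2) * age / half_life_s)

store = [MemoryItem("user prefers dark mode", importance=2.0),
         MemoryItem("user said hello", importance=0.1)]
# Retrieval favors memories that are recent, important, or both.
top = sorted(store, key=lambda m: m.score(), reverse=True)[:5]
```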
ReMe: Memory Management Kit for Agents - Remember Me, Refine Me.
LightMem: Lightweight and Efficient Memory-Augmented Generation
Experimental ground for optimizing the memory usage of PyTorch models.
A list of AI memory projects
Turn AI into a persistent, memory-powered collaborator. A universal MCP server enabling cross-platform AI memory, multi-agent coordination, and context sharing. Built on the MARM protocol for structured reasoning that evolves with your work.
Open-source calculator for LLM system requirements.
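Calculators like this are typically built on a back-of-envelope formula: weights times dtype size, plus a multiplier for KV cache and runtime buffers. A hedged sketch — the 1.2 overhead factor is an assumption for illustration, not a fixed constant:

```python
def estimate_inference_vram_gb(params_billion: float,
                               bytes_per_param: float = 2.0,
                               overhead: float = 1.2) -> float:
    """Weights * dtype size * overhead factor (KV cache, activations, buffers).

    bytes_per_param: 2.0 for fp16/bf16, 1.0 for int8, 0.5 for 4-bit quantization.
    """
    return params_billion * bytes_per_param * overhead

print(f"{estimate_inference_vram_gb(7):.1f} GB")  # ~16.8 GB for a 7B model in fp16
```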
ML model training for edge devices
Poireau: a sampling allocation debugger
Android Memory Tools, written in Python, for reading and writing the RAM of processes on Android, Linux, and Windows.
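On Linux-like targets, tools of this kind typically work through the /proc filesystem: parse /proc/&lt;pid&gt;/maps to locate a region, then seek and read in /proc/&lt;pid&gt;/mem. A minimal sketch (helper names are hypothetical, and root or ptrace permission is usually required):

```python
def heap_range(pid: int) -> tuple[int, int]:
    # /proc/<pid>/maps lists memory regions; find the [heap] mapping.
    with open(f"/proc/{pid}/maps") as maps:
        for line in maps:
            if "[heap]" in line:
                start, end = (int(x, 16) for x in line.split()[0].split("-"))
                return start, end
    raise RuntimeError("no heap mapping found")

def read_memory(pid: int, addr: int, length: int) -> bytes:
    # Seek into /proc/<pid>/mem at a mapped address and read raw bytes.
    with open(f"/proc/{pid}/mem", "rb") as mem:
        mem.seek(addr)
        return mem.read(length)
```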
Reproducible Structured Memory for LLMs