# memonto

`memonto` (memory + ontology) augments AI agents with long-term memory through a knowledge graph. The knowledge graph enables agents to remember past interactions, understand relationships between data, and improve contextual awareness.
- Define the ontology for the information you want `memonto` to retain.
- Extract that information from any unstructured text to a knowledge graph.
- Query your knowledge graph for intelligent summaries or raw data for RAG.
```sh
pip install memonto
```
Use `memonto` without any data stores.

> **Important**: `ephemeral` mode is recommended for simpler/smaller use cases.
Define RDF ontology
```python
from memonto import Memonto
from rdflib import Graph, Namespace, RDF, RDFS

g = Graph()

HIST = Namespace("history:")
g.bind("hist", HIST)

# Classes
g.add((HIST.Person, RDF.type, RDFS.Class))
g.add((HIST.Event, RDF.type, RDFS.Class))
g.add((HIST.Place, RDF.type, RDFS.Class))

# Person -> Place
g.add((HIST.isFrom, RDF.type, RDF.Property))
g.add((HIST.isFrom, RDFS.domain, HIST.Person))
g.add((HIST.isFrom, RDFS.range, HIST.Place))

# Person -> Event
g.add((HIST.participatesIn, RDF.type, RDF.Property))
g.add((HIST.participatesIn, RDFS.domain, HIST.Person))
g.add((HIST.participatesIn, RDFS.range, HIST.Event))
```
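To sanity-check the ontology before wiring it into `memonto`, you can serialize the graph with standard rdflib (assuming rdflib 6+, where `serialize` returns a `str`):

```python
# Print the ontology as Turtle to verify the classes and properties above.
print(g.serialize(format="turtle"))
```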
Configure LLM
```python
config = {
    "model": {
        "provider": "openai",
        "config": {
            "model": "gpt-4o",
            "api_key": "api-key",
        },
    }
}
```
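Rather than hardcoding the key, a common pattern is to read it from the environment; the config shape stays the same (`OPENAI_API_KEY` is an assumed variable name, not one `memonto` requires):

```python
import os

# Same config as above, but the API key is read from the environment.
config = {
    "model": {
        "provider": "openai",
        "config": {
            "model": "gpt-4o",
            "api_key": os.environ["OPENAI_API_KEY"],
        },
    }
}
```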
Enable Ephemeral Mode
```python
memonto = Memonto(
    ontology=g,
    namespaces={"hist": HIST},
    ephemeral=True,
)
memonto.configure(config)
```
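With that in place, the whole lifecycle runs in memory. All of the calls below are covered in detail later in this README:

```python
# Extract facts into the in-memory graph, summarize them, then clear them.
memonto.retain("Otto von Bismarck was a Prussian statesman and diplomat.")
print(memonto.recall())
memonto.forget()
```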
Enable a triple store for persistent storage. To configure a triple store, add `triple_store` to the top level of your `config` dictionary.
Configure Triple Store
```python
config = {
    "triple_store": {
        "provider": "apache_jena",
        "config": {
            "connection_url": "http://localhost:8080/dataset_name",
        },
    },
}
```
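Putting it together with the LLM settings from earlier, the top level of the config carries both keys:

```python
# Combined config: the model block from the quickstart plus the triple store.
config = {
    "model": {
        "provider": "openai",
        "config": {
            "model": "gpt-4o",
            "api_key": "api-key",
        },
    },
    "triple_store": {
        "provider": "apache_jena",
        "config": {
            "connection_url": "http://localhost:8080/dataset_name",
        },
    },
}
```

As before, pass it to `memonto.configure(config)`.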
Install Apache Jena Fuseki
- Download Apache Jena Fuseki.
- Unzip it to the desired folder.

  ```sh
  tar -xzf apache-jena-fuseki-X.Y.Z.tar.gz
  ```

- Run a local server (a quick reachability check follows this list).

  ```sh
  ./fuseki-server --port=8080
  ```
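As a sanity check that Fuseki is reachable before pointing `memonto` at it, you can hit its admin ping endpoint. A sketch using only the Python standard library; the `/$/ping` path belongs to Fuseki's admin protocol, not to `memonto`:

```python
import urllib.request

# Expect HTTP 200 from Fuseki's admin ping endpoint if the server is up.
with urllib.request.urlopen("http://localhost:8080/$/ping") as resp:
    print(resp.status)
```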
Enable a vector store for contextual retrieval. To configure a vector store, add `vector_store` to the top level of your `config` dictionary.

> **Important**: You must enable a triple store in conjunction with a vector store.
Configure Local Vector Store
```python
config = {
    "vector_store": {
        "provider": "chroma",
        "config": {
            "mode": "local",
            "path": ".local",
        },
    },
}
```
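Since a vector store requires a triple store, a complete persistent setup carries all three top-level keys (a sketch combining the blocks shown above):

```python
# Full persistent config: LLM, triple store, and vector store together.
config = {
    "model": {
        "provider": "openai",
        "config": {
            "model": "gpt-4o",
            "api_key": "api-key",
        },
    },
    "triple_store": {
        "provider": "apache_jena",
        "config": {
            "connection_url": "http://localhost:8080/dataset_name",
        },
    },
    "vector_store": {
        "provider": "chroma",
        "config": {
            "mode": "local",
            "path": ".local",
        },
    },
}
```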
Extract information from text that maps onto your ontology. `memonto` will only extract data that matches an entity in your ontology.
memonto.retain("Otto von Bismarck was a Prussian statesman and diplomat who oversaw the unification of Germany.")
Get a summary of the current memories. You can provide a `context` for `memonto` to only summarize the memories that are relevant to that `context`.

> **Important**: When in `ephemeral` mode, all memories will be returned even if a `context` is provided.
```python
# retrieve summary of memory relevant to a context
memonto.recall("Germany could unify under Prussia or Austria.")

# retrieve summary of all stored memory
memonto.recall()
```
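Assuming `recall` returns the summary rather than printing it, the result can be captured and fed into a prompt or logged:

```python
# Capture the summary (assumed to be returned as a string) for later use.
summary = memonto.recall("Germany could unify under Prussia or Austria.")
print(summary)
```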
Get raw knowledge graph data that can be programmatically parsed, or query for a summary that is relevant to a given context.

> **Important**: When in `ephemeral` mode, raw queries are not supported.
```python
# retrieve raw memory data by schema
memonto.retrieve(uri=HIST.Person)

# retrieve raw memory data by SPARQL query
memonto.retrieve(query="SELECT ?s ?p ?o WHERE {GRAPH ?g {?s ?p ?o .}}")
```
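For illustration, a more selective query can filter by a class from the `hist` namespace defined above (a hypothetical query; adjust the graph pattern to your dataset):

```python
# Hypothetical example: fetch only Person individuals from the named graphs.
results = memonto.retrieve(
    query="SELECT ?person WHERE { GRAPH ?g { ?person a <history:Person> . } }"
)
print(results)
```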
Forget about it.
```python
memonto.forget()
```
`memonto` supports RDF namespaces as well. Just pass in a dictionary with the namespace's name along with its `rdflib.Namespace` object.
```python
memonto = Memonto(
    ontology=g,
    namespaces={"hist": HIST},
)
```
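Multiple namespaces work the same way; for example, adding a second (hypothetical) `geo` namespace alongside `hist`:

```python
from rdflib import Namespace

# GEO is a hypothetical second namespace for illustration.
GEO = Namespace("geography:")
g.bind("geo", GEO)

memonto = Memonto(
    ontology=g,
    namespaces={"hist": HIST, "geo": GEO},
)
```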
Enable `memonto` to automatically expand your ontology to cover new data and relations. If `memonto` sees new information that does not fit your ontology, it will automatically extend the ontology to cover that information.
```python
memonto = Memonto(
    id="some_id_123",
    ontology=g,
    namespaces={"hist": HIST},
    auto_expand=True,
)
```
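For example, retaining a fact with no matching class (the ontology above has no notion of a treaty) would normally be dropped; with `auto_expand` enabled, `memonto` can grow the ontology to capture it:

```python
# With auto_expand=True, a fact outside the current ontology (a treaty)
# prompts memonto to add the classes/properties needed to store it.
memonto.retain("The Treaty of Frankfurt ended the Franco-Prussian War in 1871.")
```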
All main functionalities have an async version following the naming pattern `a{func_name}`:

```python
import asyncio

async def main():
    await memonto.aretain("Some user query or message")
    await memonto.arecall()
    await memonto.aretrieve(uri=HIST.Person)
    await memonto.aforget()

asyncio.run(main())
```
| LLM | | Vector Store | | Triple Store | |
|---|---|---|---|---|---|
| OpenAI | Supported | Chroma | Supported | Apache Jena | Supported |
| Anthropic | Supported | Pinecone | Planned | | |
| Meta Llama | Planned | Weaviate | Planned | | |
Feedback on what to support next is always welcome!
Requires Python 3.7 or higher.