memonto 🧠

memonto (memory + ontology) augments AI agents with long-term memory through a knowledge graph. The knowledge graph enables agents to remember past interactions, understand relationships between data, and improve contextual awareness.

  • Define the ontology for the information you want memonto to retain.
  • Extract that information from any unstructured text into a knowledge graph.
  • Query your knowledge graph for intelligent summaries or raw data for RAG.
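
Putting those three steps together, a minimal end-to-end flow looks roughly like the sketch below. The ontology graph g, the namespace HIST, and the config dictionary are defined in the sections that follow, and recall() is assumed here to return the summary it produces.

from memonto import Memonto

# assumes g, HIST, and config are set up as shown in the Configure section below
memonto = Memonto(ontology=g, namespaces={"hist": HIST}, ephemeral=True)
memonto.configure(config)

memonto.retain("Otto von Bismarck was a Prussian statesman and diplomat.")
print(memonto.recall())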

🚀 Install

pip install memonto

⚙️ Configure

Ephemeral Mode

Use memonto without any data stores.

Important

Ephemeral mode is recommended for simpler or smaller use cases.

Define RDF ontology

from memonto import Memonto
from rdflib import Graph, Namespace, RDF, RDFS

g = Graph()

# namespace for the history ontology, bound to the prefix "hist"
HIST = Namespace("history:")

g.bind("hist", HIST)

# classes
g.add((HIST.Person, RDF.type, RDFS.Class))
g.add((HIST.Event, RDF.type, RDFS.Class))
g.add((HIST.Place, RDF.type, RDFS.Class))

# properties, each with a domain and range
g.add((HIST.isFrom, RDF.type, RDF.Property))
g.add((HIST.isFrom, RDFS.domain, HIST.Person))
g.add((HIST.isFrom, RDFS.range, HIST.Place))

g.add((HIST.participatesIn, RDF.type, RDF.Property))
g.add((HIST.participatesIn, RDFS.domain, HIST.Person))
g.add((HIST.participatesIn, RDFS.range, HIST.Event))

Configure LLM

config = {
    "model": {
        "provider": "openai",
        "config": {
            "model": "gpt-4o",
            "api_key": "api-key",
        },
    }
}
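
Anthropic models are also supported (see the support table below). The snippet here is only a sketch: the "anthropic" provider key is assumed by analogy with "openai", and the model id is just an example.

config = {
    "model": {
        "provider": "anthropic",   # assumed provider key; check memonto's docs
        "config": {
            "model": "claude-3-5-sonnet-20240620",  # example Claude model id
            "api_key": "api-key",
        },
    }
}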

Enable Ephemeral Mode

memonto = Memonto(
    ontology=g,
    namespaces={"hist": HIST},
    ephemeral=True,
)
memonto.configure(config)

Triple Store Mode

Enable triple store for persistent storage. To configure a triple store, add triple_store to the top level of your config dictionary.

Configure Triple Store

config = {
    "triple_store": {
        "provider": "apache_jena",
        "config": {
            "connection_url": "http://localhost:8080/dataset_name",
        },
    },
}
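
For persistent storage, the model and triple store sections sit side by side at the top level of the same config dictionary, and Memonto is constructed without ephemeral mode. A sketch combining the two examples above:

config = {
    "model": {
        "provider": "openai",
        "config": {
            "model": "gpt-4o",
            "api_key": "api-key",
        },
    },
    "triple_store": {
        "provider": "apache_jena",
        "config": {
            "connection_url": "http://localhost:8080/dataset_name",
        },
    },
}

memonto = Memonto(
    ontology=g,
    namespaces={"hist": HIST},
)
memonto.configure(config)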

Install Apache Jena Fuseki

  1. Download Apache Jena Fuseki from the Apache Jena website.
  2. Unzip it to the desired folder.
tar -xzf apache-jena-fuseki-X.Y.Z.tar.gz
  3. Run a local server.
./fuseki-server --port=8080
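
The connection_url above points at a dataset named dataset_name, which needs to exist on the Fuseki server. You can create it from the Fuseki web UI at http://localhost:8080, or, as a sketch, through Fuseki's admin HTTP API (newer Fuseki releases may require admin credentials for this endpoint):

# uses the third-party requests library; dataset name and port match the config above
import requests

resp = requests.post(
    "http://localhost:8080/$/datasets",
    data={"dbName": "dataset_name", "dbType": "tdb"},  # "mem" for an in-memory dataset
)
resp.raise_for_status()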

Triple + Vector Stores Mode

Enable vector store for contextual retrieval. To configure a vector store, add vector_store to the top level of your config dictionary.

Important

You must enable triple store in conjunction with vector store.

Configure Local Vector Store

config = {
    "vector_store": {
        "provider": "chroma",
        "config": {
            "mode": "remote", 
            "path": ".local",
        },
    },
}
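
Putting it all together, a config that enables the model, triple store, and vector store uses the same three top-level keys shown above:

config = {
    "model": {
        "provider": "openai",
        "config": {
            "model": "gpt-4o",
            "api_key": "api-key",
        },
    },
    "triple_store": {
        "provider": "apache_jena",
        "config": {
            "connection_url": "http://localhost:8080/dataset_name",
        },
    },
    "vector_store": {
        "provider": "chroma",
        "config": {
            "mode": "local",
            "path": ".local",
        },
    },
}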

🧰 Usage

Retain

Extract information from text that maps onto your ontology. Only data that matches an entity in your ontology will be extracted.

memonto.retain("Otto von Bismarck was a Prussian statesman and diplomat who oversaw the unification of Germany.")
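
With the history ontology above, this call might add triples along the following lines. The IRIs are illustrative only; what actually gets extracted depends on the LLM.

# illustrative only -- roughly the kind of triples retain() could produce
(HIST.Otto_von_Bismarck, RDF.type, HIST.Person)
(HIST.Otto_von_Bismarck, HIST.isFrom, HIST.Prussia)
(HIST.Otto_von_Bismarck, HIST.participatesIn, HIST.Unification_of_Germany)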

Recall

Get a summary of the current memories. You can provide a context so that memonto only summarizes the memories relevant to that context.

Important

When in ephemeral mode, all memories will be returned even if a context is provided.

# retrieve summary of memory relevant to a context
memonto.recall("Germany could unify under Prussia or Austria.")

# retrieve summary of all stored memory
memonto.recall()

Retrieve

Get raw knowledge graph data that can be parsed programmatically, or query for a summary relevant to a given context.

Important

When in ephemeral mode, raw queries are not supported.

# retrieve raw memory data by schema
memonto.retrieve(uri=HIST.Person)

# retrieve raw memory data by SPARQL query
memonto.retrieve(query="SELECT ?s ?p ?o WHERE {GRAPH ?g {?s ?p ?o .}}")
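
Since the ontology binds the hist prefix to the history: namespace, a more targeted SPARQL sketch could pull just the hist:Person individuals and their properties (this assumes memonto types extracted individuals with rdf:type):

# a sketch: list every hist:Person and its outgoing properties
query = """
PREFIX hist: <history:>
SELECT ?person ?p ?o
WHERE {
    GRAPH ?g {
        ?person a hist:Person ;
                ?p ?o .
    }
}
"""
memonto.retrieve(query=query)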

Forget

Forget about it.

memonto.forget()

RDF Namespaces

memonto supports RDF namespaces as well. Just pass in a dictionary mapping each namespace's prefix to its rdflib.Namespace object.

memonto = Memonto(
    ontology=g,
    namespaces={"hist": HIST},
)
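
For example, a second (purely illustrative) geography namespace can be bound alongside the history one:

from rdflib import Namespace

GEO = Namespace("geography:")  # hypothetical namespace, for illustration only
g.bind("geo", GEO)

memonto = Memonto(
    ontology=g,
    namespaces={"hist": HIST, "geo": GEO},
)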

Auto Expand Ontology

Enable memonto to automatically expand your ontology to cover new data and relations. When memonto encounters information that does not fit your ontology, it automatically extends the ontology to cover it.

memonto = Memonto(
    id="some_id_123",
    ontology=g,
    namespaces={"hist": HIST},
    auto_expand=True,
)
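
For instance, the history ontology above has no vocabulary for political roles or organizations, so with auto_expand=True memonto may extend the ontology before retaining a fact like this (illustrative; what gets added depends on the LLM):

memonto.retain("Otto von Bismarck served as the first chancellor of the German Empire.")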

🔀 Async Usage

All main functionalities have an async version that follows the naming pattern a{func_name} (for example, retain becomes aretain):

import asyncio

async def main():
    await memonto.aretain("Some user query or message")
    await memonto.arecall()
    await memonto.aretrieve(uri=HIST.Person)
    await memonto.aforget()

asyncio.run(main())

🔮 Current and Upcoming Support

LLM             Vector Store    Triple Store
OpenAI ✅       Chroma ✅       Apache Jena ✅
Anthropic ✅    Pinecone 🔜
Meta llama 🔜   Weaviate 🔜

Feedback on what to support next is always welcome!

💯 Requirements

Python 3.7 or higher.