
Add LRU cache service #142

Open

Description

@hinerm

In #141 we stated the need for caching, but not the method by which we would cache.

In the long term we should have a generalized LRU cache that we can use to enforce memory constraints while still caching items that end up being used frequently.

API considerations

  • I would prefer that consumers not have to manually release references when they are done, but it would be nice if the cache could ensure certain items are held in memory for a period of time. Maybe that is impossible, though?
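Holding items for a period of time without manual release is at least mechanically possible: the cache can track a "pinned until" timestamp per entry and skip pinned entries during eviction. A minimal sketch (all names here are hypothetical, not from this issue):

```python
import time
from collections import OrderedDict


class PinningLRUCache:
    """LRU cache where an entry can be pinned for a minimum duration.

    Pinned entries are skipped during eviction until their pin expires,
    so consumers never release anything manually. Note the trade-off:
    if every entry is pinned, the cache can temporarily exceed capacity.
    """

    def __init__(self, capacity):
        self.capacity = capacity
        self._entries = OrderedDict()  # key -> (value, pinned_until)

    def put(self, key, value, pin_seconds=0.0):
        self._entries[key] = (value, time.monotonic() + pin_seconds)
        self._entries.move_to_end(key)  # mark as most recently used
        self._evict()

    def get(self, key):
        value, _ = self._entries[key]
        self._entries.move_to_end(key)
        return value

    def _evict(self):
        now = time.monotonic()
        # Walk from least to most recently used, skipping pinned entries.
        for key in list(self._entries):
            if len(self._entries) <= self.capacity:
                break
            _, pinned_until = self._entries[key]
            if pinned_until <= now:
                del self._entries[key]
```

The cost of this approach is a soft memory bound rather than a hard one: the pin overrides the capacity limit until it expires.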

Potential design:
give the cache an object plus a way to regenerate that object when needed. The cache stores both and returns a handle that accesses the object and knows how to re-create it if it has been evicted. This seems maybe too dangerous, though?
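The regeneration idea above could look roughly like this. This is a sketch under assumed names (nothing here is a committed API), but it shows why the design is workable: the regeneration function is retained even after its value is evicted, so a `get` after eviction transparently rebuilds the value.

```python
from collections import OrderedDict
from typing import Callable


class RegeneratingLRUCache:
    """LRU cache that keeps a regeneration function alongside each entry.

    Values are evictable; regeneration functions are kept, so an evicted
    value is silently re-created on the next access.
    """

    def __init__(self, capacity: int):
        self.capacity = capacity
        self._values = OrderedDict()  # key -> cached value (evictable)
        self._makers = {}             # key -> zero-arg regeneration function

    def put(self, key, value, regenerate: Callable) -> None:
        self._makers[key] = regenerate
        self._values[key] = value
        self._values.move_to_end(key)  # mark as most recently used
        while len(self._values) > self.capacity:
            self._values.popitem(last=False)  # evict least recently used

    def get(self, key):
        if key not in self._makers:
            raise KeyError(key)
        if key not in self._values:
            # Value was evicted: re-create it and re-insert it.
            self.put(key, self._makers[key](), self._makers[key])
        else:
            self._values.move_to_end(key)
        return self._values[key]
```

The "dangerous" part is visible here: the cache pins the regeneration closures forever (and whatever they capture), and a `get` can trigger arbitrarily expensive recomputation at an unexpected time. Both would need to be acceptable for the consumers we have in mind.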
