
Enhancement: Add ability to pass local cache to chat models #17242

Closed as not planned
@eyurtsev

Description


Feature request discussed in #17176

Expand `cache` to accept a cache implementation in addition to a bool value:

If a cache instance is provided, the model will use that cache instead of just toggling the global one.
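
A minimal sketch of what the requested API could look like, assuming the existing `cache` field on chat models is widened to also accept a `BaseCache` implementation such as `InMemoryCache` (the field name and classes are illustrative of the request, not a final design):

```python
# Sketch only: assumes `cache` accepts a BaseCache instance, as requested above.
from langchain_core.caches import InMemoryCache
from langchain_openai import ChatOpenAI

# A cache scoped to this one model, rather than the process-wide global cache
# that `cache=True` enables today.
local_cache = InMemoryCache()

chat = ChatOpenAI(cache=local_cache)

# The first call hits the provider; an identical second call would be
# answered from `local_cache`.
chat.invoke("Tell me a joke about caching")
chat.invoke("Tell me a joke about caching")
```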

Acceptance Criteria

The PR can include an implementation for caching of LLMs in addition to chat models.
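
For that criterion, the same pattern would carry over to plain (non-chat) LLMs; a hedged sketch, again assuming the `cache` field is widened in the shared base class:

```python
# Sketch only: the same assumption applied to a non-chat LLM.
from langchain_core.caches import InMemoryCache
from langchain_openai import OpenAI

llm = OpenAI(cache=InMemoryCache())  # per-model cache instead of a bool flag
print(llm.invoke("What is the capital of France?"))
```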


Labels

good first issue · help wanted · Ɑ: models · 🔌: redis · 🤖: enhancement
