Thread Safety in llama.cpp #596

Open
@martindevans

Description

Tracking issue for thread safety in llama.cpp. The global inference lock can be removed once the upstream issue below is resolved.

ggml-org/llama.cpp#3960
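For reference, the "global inference lock" mentioned above is a workaround pattern: a single process-wide lock that serializes every call into native llama.cpp inference so no two threads decode concurrently. Below is a minimal C++ sketch of that pattern, assuming the llama_decode API; the locked_decode wrapper and the mutex name are illustrative only, not LLamaSharp's actual implementation.

```cpp
// Sketch of the global-inference-lock pattern: one process-wide mutex
// serializes all llama.cpp decode calls. llama_decode is the real
// llama.cpp entry point; the wrapper around it is hypothetical.
#include <mutex>

#include "llama.h"

// One mutex shared by every context; only one decode runs at a time.
static std::mutex g_inference_lock;

// Hypothetical wrapper: route all inference through the lock so callers
// never invoke llama_decode concurrently.
static int32_t locked_decode(llama_context * ctx, llama_batch batch) {
    std::lock_guard<std::mutex> guard(g_inference_lock);
    return llama_decode(ctx, batch);
}
```

Once thread safety lands upstream, the lock could be dropped and independent contexts decoded from separate threads.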

Metadata

Assignees

No one assigned

    Labels

    Upstream (Tracking an issue in llama.cpp)
    do not close (Protect this issue from auto closing)

    Type

    No type

    Projects

    No projects

    Milestone

    No milestone

    Relationships

    None yet

    Development

    No branches or pull requests
