run locally #9

@ShasTheMass

Description

Hello, great work here. But I wonder: could this be made to run completely locally, e.g. with an Ollama-based model? Has anyone tried this? Are the small models that fit on a PC/Mac with, say, 16 GB of memory good enough to understand screenshots?

Hope to hear back from you, @deedy
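For reference, Ollama exposes a local REST API (by default at `http://localhost:11434`), and vision-capable models such as `llava` accept base64-encoded images via the `images` field of `/api/generate`. A minimal sketch of what a fully local screenshot query might look like — the model name and prompt here are illustrative, not what this project uses:

```python
import base64
import json
import urllib.request

# Default Ollama endpoint; assumes `ollama serve` is running locally
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_payload(image_path: str, prompt: str, model: str = "llava") -> dict:
    """Build the JSON body for Ollama's /api/generate with an attached image."""
    with open(image_path, "rb") as f:
        img_b64 = base64.b64encode(f.read()).decode("ascii")
    return {
        "model": model,        # any vision-capable model pulled locally
        "prompt": prompt,
        "images": [img_b64],   # Ollama expects base64-encoded image bytes
        "stream": False,       # single JSON response instead of a stream
    }


def describe_screenshot(image_path: str, prompt: str = "Describe this screen.") -> str:
    """Send a screenshot to a locally running Ollama vision model."""
    body = json.dumps(build_payload(image_path, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Whether a model small enough for 16 GB actually understands screenshots well enough is the open question — this only shows that the plumbing is straightforward.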
