
Android support #325

Closed
@sasskialudin

Description


I know you closed a previously opened issue on this topic on December 14th for lack of a response from the original poster, but meanwhile more powerful smartphones are making headway, so the issue is very relevant again ;-)

Here is a copy of the message I left on your closed thread.


Fantastic project, kudos!

I'm also interested in running a choice of models from an Android smartphone.

I'm a developer, and my use case involves an app driving the phone by voice (prompts will be acquired by a speech-to-text frontend), triggering agents from a bound LLM that supports function calls.

I just bought a state-of-the-art phone to make this possible: a OnePlus 12 featuring a Snapdragon 8 Gen 3 SoC with 24 GB RAM (available only from China!) and 1 TB of storage. It should be enough to execute quantized 13B models.

My app will be built with Flutter and should interoperate with the chosen llamafile model via Dart FFI.

Is that possible?
