I have tried whisper.cpp on my iPhone and it runs very fast, so I wonder whether llama.cpp could support the iPhone as well. Thank you.