I'd like to test this one: https://huggingface.co/primeline/whisper-tiny-german-1224, but I have no idea how to load and use it with local-ai.
Any ideas?
Answered by markuman, Mar 17, 2025:
Ok, I can answer this question by myself now. Download the model snapshot from Hugging Face:

from huggingface_hub import snapshot_download

model_id = "primeline/whisper-large-v3-german"  # replace with the ID of the model you want to download
snapshot_download(repo_id=model_id, local_dir="whisper-large-v3-german")
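For reference, the same call works for the tiny model from the original question; only the repo id changes. A quick sketch (the local directory name is arbitrary) that downloads it and prints which files landed on disk:

from pathlib import Path

from huggingface_hub import snapshot_download

model_id = "primeline/whisper-tiny-german-1224"  # repo id from the question above
local_dir = "whisper-tiny-german-1224"  # arbitrary target directory

snapshot_download(repo_id=model_id, local_dir=local_dir)

# Show what was downloaded (config, tokenizer files, model weights, ...).
for path in sorted(Path(local_dir).rglob("*")):
    if path.is_file():
        print(path.relative_to(local_dir))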
The remaining steps (a scripted sketch follows the list):
1. Clone whisper and whisper.cpp.
2. Apply this patch to your whisper.cpp clone if it is not merged yet: ggerganov/whisper.cpp#2840
3. Install the Python dependencies.
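Here is a minimal scripted sketch of that sequence combined with the download call above. It assumes the convert-h5-to-ggml.py script that ships in whisper.cpp/models and that torch and transformers cover the Python dependencies it needs; the argument order of the conversion call is from memory and not re-verified, so check the usage line in your checkout:

import subprocess
import sys

from huggingface_hub import snapshot_download

def run(*cmd):
    # Echo and run a command, aborting on the first failure.
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

# 1. Clone both repositories; the conversion script in whisper.cpp reads
#    assets (e.g. mel filters) from the openai/whisper checkout.
run("git", "clone", "https://github.com/openai/whisper")
run("git", "clone", "https://github.com/ggerganov/whisper.cpp")

# 2. Apply ggerganov/whisper.cpp#2840 to the whisper.cpp clone by hand here
#    if it has not been merged yet (see the step list above).

# 3. Install the Python dependencies (assumption: torch and transformers are
#    what the conversion script imports; adjust to your checkout).
run(sys.executable, "-m", "pip", "install", "torch", "transformers", "huggingface_hub")

# 4. Download the Hugging Face checkpoint.
model_dir = "whisper-large-v3-german"
snapshot_download(repo_id="primeline/whisper-large-v3-german", local_dir=model_dir)

# 5. Convert it to ggml. The argument order (model dir, path to the whisper
#    repo, output dir) is an assumption -- check the script's usage line.
run(sys.executable, "whisper.cpp/models/convert-h5-to-ggml.py", model_dir, "whisper", ".")

If this runs through, it should leave a ggml model file in the output directory, which is the format that whisper.cpp-based backends load.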