How to use different whisper models from huggingface? #5025

Closed Answered by markuman
markuman asked this question in Q&A

OK, I can now answer this question myself.

1. Download the model to a local directory:

   ```python
   from huggingface_hub import snapshot_download

   model_id = "primeline/whisper-large-v3-german"  # replace with the ID of the model you want to download
   snapshot_download(repo_id=model_id, local_dir="whisper-large-v3-german")
   ```
2. Clone whisper and whisper.cpp.

3. Apply this patch to your whisper.cpp clone, if it has not been merged yet: ggerganov/whisper.cpp#2840

4. Install the Python dependencies:

   ```shell
   pip install huggingface_hub torch numpy transformers
   ```
5. Convert the model to a ggml .bin file:

   ```shell
   python whisper.cpp/models/convert-h5-to-ggml.py <model_path_from_1>/ <path_to_whisper_clone>/ output_directory/
   ```
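For scripted setups, the conversion call from the last step can also be assembled and launched from Python. This is just a minimal sketch; the directory names below are hypothetical placeholders for the model directory from step 1 and the whisper clone from step 2 — substitute your actual paths:

```python
from pathlib import Path

# Hypothetical placeholder paths -- substitute your actual directories.
model_dir = Path("whisper-large-v3-german")   # downloaded in step 1
whisper_repo = Path("whisper")                # openai/whisper clone from step 2
output_dir = Path("output_directory")

# Build the same command line as in step 5.
# Run it with: subprocess.run(cmd, check=True)
cmd = [
    "python",
    "whisper.cpp/models/convert-h5-to-ggml.py",
    f"{model_dir}/",
    f"{whisper_repo}/",
    f"{output_dir}/",
]
print(" ".join(cmd))
```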

Answer selected by markuman