Hi, I am trying to run inference on macOS Monterey 12.7.6 using the default ggml-small.bin model downloaded with the download-ggml-model.sh script. Here are the steps to reproduce the issue (a consolidated shell session follows the list):
1. Clone the whisper.cpp GitHub repo
2. Install SDL2 with `brew install sdl2`
3. Build the cloned repo with `make -j` in the root folder
4. Download the default Whisper small model with `bash ./download-ggml-model.sh small`
5. Transcribe the provided sample WAV file with `./main -m models/ggml-small.bin -f samples/jfk.wav`
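For reference, here is the same sequence as a single shell session, plus a quick sanity check on the downloaded model file. The repository URL, the working-directory changes, and the rough model size are my assumptions for illustration, not details from the original report:

```bash
# Reproduction steps on macOS, consolidated (a sketch; run from an empty directory).
git clone https://github.com/ggerganov/whisper.cpp
cd whisper.cpp
brew install sdl2
make -j

# Download the small model into models/ and sanity-check the file:
# a truncated or failed download can lead to no transcription being produced.
# The file should be several hundred MB (roughly 460-470 MB is my estimate).
cd models && bash ./download-ggml-model.sh small && cd ..
ls -lh models/ggml-small.bin

# Transcribe the bundled sample.
./main -m models/ggml-small.bin -f samples/jfk.wav
```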
Running the exact same steps produces the expected transcription output on my Ubuntu 22.04 machine. There, the transcribed text appears with start/end timestamps below the line main: processing 'samples/jfk.wav' (176000 samples, 11.0 sec), 4 threads, 1 processors, 5 beams + best of 5, lang = en, task = transcribe, timestamps = 1 .... However, on my Mac the transcription output is not shown.
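One way to narrow this down (my suggestion, not something from the original report) is to separate the log output from the transcript, and to ask main to also write the transcript to a file. The assumption that logs and transcript go to different streams, and the -otxt / --output-txt option with its <input>.txt output path, should be verified against `./main -h` on your build:

```bash
# Capture stdout and stderr separately to see whether the transcript is being
# produced but lost among the log lines (assumption: logs on stderr,
# transcript on stdout -- check ./main -h for your build).
./main -m models/ggml-small.bin -f samples/jfk.wav > transcript.log 2> run.log
cat transcript.log

# Alternatively, write the transcript to a text file next to the input
# (assuming the -otxt / --output-txt option exists in this build).
./main -m models/ggml-small.bin -f samples/jfk.wav -otxt
cat samples/jfk.wav.txt
```

If transcript.log and samples/jfk.wav.txt are both empty while the Ubuntu machine produces text, that points at inference itself rather than at how the output is displayed.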
Any help to fix this would be much appreciated!