Description
Prerequisites
Please answer the following questions for yourself before submitting an issue.
- [x] I am running the latest code. Development is very rapid so there are no tagged versions as of now.
- [x] I carefully followed the README.md.
- [x] I searched using keywords relevant to my issue to make sure that I am creating a new issue that is not already open (or closed).
- [x] I reviewed the Discussions, and have a new bug or useful enhancement to share.
Expected Behavior
I'm trying to launch the Llama API server by following the macOS install guide (docs/install/macos.md) from https://github.com/abetlen/llama-cpp-python on my MacBook M2, and I expect the server to start normally.
Current Behavior
Unfortunately, after running the last command from the guide,
python3 -m llama_cpp.server --model $MODEL --n_gpu_layers 1
I get the following error:
RuntimeError: Failed to load shared library '/Users/alessandropaticchio/miniforge3/envs/llama/lib/python3.9/site-packages/llama_cpp/libllama.dylib': dlopen(/Users/alessandropaticchio/miniforge3/envs/llama/lib/python3.9/site-packages/llama_cpp/libllama.dylib, 0x0006): tried: '/Users/alessandropaticchio/miniforge3/envs/llama/lib/python3.9/site-packages/llama_cpp/libllama.dylib' (mach-o file, but is an incompatible architecture (have 'x86_64', need 'arm64')), '/System/Volumes/Preboot/Cryptexes/OS/Users/alessandropaticchio/miniforge3/envs/llama/lib/python3.9/site-packages/llama_cpp/libllama.dylib' (no such file), '/Users/alessandropaticchio/miniforge3/envs/llama/lib/python3.9/site-packages/llama_cpp/libllama.dylib' (mach-o file, but is an incompatible architecture (have 'x86_64', need 'arm64'))
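The error suggests the installed libllama.dylib was built for x86_64 while the machine needs arm64. As a minimal diagnostic sketch (assuming it is run inside the same miniforge environment as the traceback), this checks whether the Python interpreter itself is an Intel build running under Rosetta:

```python
# Minimal diagnostic sketch: print the architecture the interpreter reports.
# If this prints "x86_64" on an Apple Silicon (M2) machine, the environment is an
# Intel build running under Rosetta, which would explain why pip installed an
# x86_64 libllama.dylib instead of an arm64 one.
import platform

print(platform.machine())         # "arm64" is expected on an M2 Mac
print(platform.python_version())  # the traceback path points at Python 3.9
```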
Any help would be much appreciated!