Hi there, I built from source with the following line: make BUILD_GRPC_FOR_BACKEND_LLAMA=true BUILD_TYPE=cublas build
and afterwards I can see the following gRPC backends present in /LocalAI/backend-assets/grpc:
-rwxr-xr-x 1 ai ai 3918808 Sep 27 06:27 bert-embeddings
-rwxr-xr-x 1 ai ai 4163096 Sep 27 06:27 huggingface
-rw------- 1 ai ai 256 Sep 26 17:00 huggingface.upx
-rwxr-xr-x 1 ai ai 364528280 Sep 27 06:27 llama-cpp-avx
-rwxr-xr-x 1 ai ai 364528280 Sep 27 06:28 llama-cpp-avx2
-rwxr-xr-x 1 ai ai 364528280 Sep 27 06:28 llama-cpp-fallback
-rwxr-xr-x 1 ai ai 364580664 Sep 27 06:29 llama-cpp-grpc
-rwxr-xr-x 1 ai ai 5602432 Sep 27 06:28 llama-ggml
-rwxr-xr-x 1 ai ai 3702652 Sep 27 06:29 local-store
-rwxr-xr-x 1 ai ai 3800140 Sep 27 06:29 rwkv
-rwxr-xr-x 1 ai ai 12927532 Sep 27 06:29 whisper
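As far as I understand, that list only contains the native (Go/C++) backends that `make build` compiles; the diffusers backend is a Python gRPC server shipped under `backend/python/` in the source tree, so it never shows up in `backend-assets/grpc`. A hedged sketch of wiring it in via LocalAI's external-backend mechanism (the `--external-grpc-backends` flag and the `run.sh` path are assumptions from that mechanism; verify them against your checkout):

```shell
# Hypothetical: point LocalAI at the Python diffusers backend as an
# external gRPC backend. Run from the LocalAI source root, after the
# backend's Python requirements are installed in your virtualenv.
BACKEND_SPEC="diffusers:$PWD/backend/python/diffusers/run.sh"

# Print the resulting launch command rather than executing it here.
echo "./local-ai --external-grpc-backends \"$BACKEND_SPEC\""
```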
I tried installing diffusers via the Hugging Face instructions, but that only adds diffusers-cli to my Python virtual environment.
I'd like to run FLUX on a self-compiled install.
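For context, once a diffusers backend is reachable, FLUX would be described with a model YAML file in the models directory. A hedged sketch, with field names based on LocalAI's diffusers model-config conventions (the model ID and values here are illustrative, not tested):

```yaml
# Illustrative model config for FLUX via the diffusers backend.
# Field names assumed from LocalAI's diffusers backend docs; adjust as needed.
name: flux.1-dev
backend: diffusers
parameters:
  model: black-forest-labs/FLUX.1-dev
f16: true
diffusers:
  cuda: true
  pipeline_type: FluxPipeline
```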