Bug: llama-server + LLava 1.6 hallucinates #8001
Labels
bug-unconfirmed
medium severity
stale
What happened?
When using `./llama-llava-cli`, I get perfectly fine descriptions of images. But when hosting LLaVA with `./llama-server`, LLaVA hallucinates big time.

Here's how I'm running LLaVA with the CLI:
```shell
./llama-llava-cli -m models/llava-v1.6-vicuna-7b.Q5_K_S.gguf --mmproj models/mmproj-model-f16.gguf --image images/sth.jpeg -c 4096
```
Here's how I'm starting the server:
```shell
./llama-server -m models/llava-v1.6-vicuna-7b.Q5_K_S.gguf --mmproj models/mmproj-model-f16.gguf -c 2048 --host 127.0.0.1 --port 8000
```
Here's the python code to send the request:
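A request of this kind can be sketched as follows, assuming llama-server's `/completion` endpoint with its `image_data` field (the prompt template, image id `12`, and helper names `build_payload`/`describe_image` are my own choices for illustration; the server address mirrors the command above):

```python
import base64
import json
import urllib.request


def build_payload(image_bytes: bytes, image_id: int = 12) -> dict:
    """Build a /completion request body carrying one base64-encoded image."""
    return {
        # The [img-N] tag marks where the server splices in image_data id N.
        "prompt": f"USER: [img-{image_id}]\nDescribe the image in detail.\nASSISTANT:",
        "image_data": [
            {"data": base64.b64encode(image_bytes).decode("utf-8"), "id": image_id}
        ],
        "n_predict": 256,
        "temperature": 0.1,
    }


def describe_image(image_path: str, server: str = "http://127.0.0.1:8000") -> str:
    """POST the payload to llama-server and return the generated text."""
    with open(image_path, "rb") as f:
        payload = build_payload(f.read())
    req = urllib.request.Request(
        server + "/completion",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["content"]
```

Note that the CLI run above uses `-c 4096` while the server run uses `-c 2048`, so the two invocations are not strictly comparable.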
Name and Version
What operating system are you seeing the problem on?
Mac
Relevant log output
No response