How do I disable F16 memory and keep memory usage within the llama.cpp settings? I'm getting OOM'd with go-llama, as below. #237
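For context, a minimal sketch of what "disabling f16 memory" would look like, assuming go-llama.cpp's functional-options API (the model path, context size, and token count below are placeholder assumptions, not values from this discussion):

```go
package main

// Sketch, not a confirmed fix: in go-llama.cpp, the f16 KV-cache is an
// opt-in model option, so the idea is simply to not pass it when loading,
// and to bound the KV-cache allocation by shrinking the context window.

import (
	"fmt"

	llama "github.com/go-skynet/go-llama.cpp"
)

func main() {
	// Hypothetical model path; replace with a real local model file.
	l, err := llama.New(
		"./models/7B/ggml-model.bin",
		llama.SetContext(512), // smaller context => smaller KV-cache
		// NOTE: llama.EnableF16Memory is deliberately NOT passed here;
		// omitting it is what "disabling f16 memory" amounts to.
	)
	if err != nil {
		fmt.Println("load failed:", err)
		return
	}
	defer l.Free()

	out, err := l.Predict("Hello", llama.SetTokens(16))
	if err != nil {
		fmt.Println("predict failed:", err)
		return
	}
	fmt.Println(out)
}
```

This mirrors llama.cpp's own behavior, where the key/value memory defaults to f32 unless f16 memory is requested; whether it resolves the OOM depends on the model size and available RAM.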
Unanswered · hiqsociety asked this question in Q&A
Replies: 0 comments
@mudler
llama.cpp
go-llama.cpp