Eval bug: A SiLU operand overflow occurred, causing the program to malfunction. #12523

Closed
Yangxiaoz opened this issue Mar 23, 2025 · 0 comments

Yangxiaoz commented Mar 23, 2025

Name and Version

./llama-cli --version
register_backend: registered backend CPU (1 devices)
register_device: registered device CPU (13th Gen Intel(R) Core(TM) i9-13900H)
load_backend: failed to find ggml_backend_init in ~/workspace/github/llama.cpp/build/bin/libggml-cpu.so
version: 4942 (fbdfefe)
built with cc (Ubuntu 12.3.0-1ubuntu1~22.04) 12.3.0 for x86_64-linux-gnu

Operating systems

Linux

GGML backends

CPU

Hardware

13th Gen Intel(R) Core(TM) i9-13900H

Models

DeepSeek-Coder-V2-Lite-Instruct-Q4_K_M.gguf

Problem description & steps to reproduce

When I built in debug mode with "cmake -B build -DCMAKE_BUILD_TYPE=Debug" and ran the DeepSeek-Coder-V2-Lite model, it no longer worked properly.
After some basic debugging, I found that during traversal of the computation graph, specifically while applying the SiLU operation to the FFN MoE gate (executing the node ffn_moe_silu-1), a numerical overflow occurred in the operand x, causing the program to crash.
The issue appears unrelated to debug mode itself: switching to release mode allowed the program to run, but the output was incorrect: "<|begin▁of▁sentence|>Hello my name is!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!"
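
For reference, SiLU is silu(x) = x * sigmoid(x) = x / (1 + exp(-x)), and in a debug build ggml asserts that the operand is neither NaN nor infinite before applying it, which is exactly the assertion shown in the log below. The snippet here is a minimal standalone sketch, not the actual ggml source (the function and variable names are illustrative), showing why an overflowed value produced by an earlier node such as the MoE gate projection aborts at this point:

```c
#include <assert.h>
#include <math.h>
#include <stdio.h>

// Illustrative sketch only: elementwise SiLU with the same kind of
// NaN/inf operand checks that the debug assertions enforce.
static float silu_f32(float x) {
    assert(!isnan(x));             // fires if the operand is NaN
    assert(!isinf(x));             // fires if the operand overflowed to +/-inf
    return x / (1.0f + expf(-x));  // silu(x) = x * sigmoid(x)
}

int main(void) {
    // A well-behaved gate activation passes the checks...
    printf("silu(1.5) = %f\n", silu_f32(1.5f));

    // ...but a value that already overflowed earlier in the graph
    // (stand-in for the bad operand x) aborts here in a debug build.
    float overflowed = HUGE_VALF;
    printf("silu(inf) = %f\n", silu_f32(overflowed));
    return 0;
}
```

Because assert() is compiled out when NDEBUG is defined, the same bad values pass through silently in a release build and only show up as garbage output, which matches the behavior described above.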

First Bad Commit

Based on my preliminary investigation, this problem appears to have been introduced by #12181.

Relevant log output

llama-simple: ~/workspace/github/llama.cpp/ggml/src/ggml-cpu/ggml-cpu.c:7670: ggml_compute_forward_silu_f32: Assertion `!isnan(x)' failed.
llama-simple: ~/workspace/github/llama.cpp/ggml/src/ggml-cpu/ggml-cpu.c:7670: ggml_compute_forward_silu_f32: Assertion `!isnan(x)' failed.
llama-simple: ~/workspace/github/llama.cpp/ggml/src/ggml-cpu/ggml-cpu.c:7671: ggml_compute_forward_silu_f32: Assertion `!isinf(x)' failed.
Aborted (core dumped)