Error in converting llama #29

Open
purejomo opened this issue Aug 9, 2024 · 0 comments

Comments

purejomo commented Aug 9, 2024

I'm trying to convert my llama2 7b model following the README.

In STEP 1, I ran the command below:
python3 /home/ghlee/transformers/models/llama/convert_llama_weights_to_hf.py --input_dir /home/ghlee/llama/llama-2-7b --model_size 7B --output_dir /home/ghlee/llama.onnx/onnx_converted
and got the following error:

Traceback (most recent call last):
  File "/home/ghlee/transformers/models/llama/convert_llama_weights_to_hf.py", line 407, in <module>
    main()
  File "/home/ghlee/transformers/models/llama/convert_llama_weights_to_hf.py", line 396, in main
    write_model(
  File "/home/ghlee/transformers/models/llama/convert_llama_weights_to_hf.py", line 178, in write_model
    f"model.layers.{layer_i}.self_attn.k_proj.weight": permute(
  File "/home/ghlee/transformers/models/llama/convert_llama_weights_to_hf.py", line 154, in permute
    return w.view(n_heads, dim1 // n_heads // 2, 2, dim2).transpose(1, 2).reshape(dim1, dim2)
RuntimeError: shape '[32, 2, 2, 4096]' is invalid for input of size 16777216

It seems like the shapes do not match, but I never modified params.json or config.json.
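
For reference, a minimal sketch of the shape arithmetic behind the failing permute. The head count and hidden size below are assumptions based on the standard Llama-2 7B config (32 heads, hidden size 4096); only the view shape and element count come from the traceback above.

import torch

# Assumed standard Llama-2 7B attention sizes
n_heads = 32
dim = 4096

w = torch.randn(dim, dim)  # k_proj weight: 4096 * 4096 = 16,777,216 elements

# The expected call for a 7B checkpoint views into [32, 64, 2, 4096] and succeeds:
ok = w.view(n_heads, dim // n_heads // 2, 2, dim).transpose(1, 2).reshape(dim, dim)

# The traceback instead shows view([32, 2, 2, 4096]), which only covers
# 32 * 2 * 2 * 4096 = 524,288 elements, far fewer than the 16,777,216 elements
# in the weight, hence the RuntimeError. That shape implies dim1 was passed as
# 128 rather than 4096, i.e. the script's head/dim bookkeeping disagrees with
# the checkpoint it is reading (an assumption, not a confirmed diagnosis).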

Is there any way to fix it?
