[Bug]: Llama chat template cannot process tool_calls=[] in previous messages #13978
Your current environment
The output of `python collect_env.py`
🐛 Describe the bug
The chat template for Llama 3.1 and 3.2 asserts that `len(tool_calls) == 1` for each message with the attribute `tool_calls` in the request. If `tool_calls: []` is provided in an assistant message as part of the request, the chat template throws and vLLM rejects the request.

To reproduce:
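A minimal sketch of a failing request, assuming a vLLM OpenAI-compatible server; the base URL and model name are placeholders, not taken from the original report:

```python
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

messages = [
    {"role": "user", "content": "What's the weather in Berlin?"},
    # An assistant turn round-tripped from a previous response; some clients
    # serialize an empty tool_calls list instead of omitting the field.
    {"role": "assistant", "content": "Let me check.", "tool_calls": []},
    {"role": "user", "content": "Thanks!"},
]

# The chat template rejects any tool_calls list whose length is not exactly 1,
# so this request fails even though no tool call is actually present.
client.chat.completions.create(
    model="meta-llama/Llama-3.1-8B-Instruct",
    messages=messages,
)
```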
This throws an exception during chat template application. Here's the snippet of the chat template responsible for the problem:
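Reproduced from memory of the Llama 3.1 template on Hugging Face, so treat the exact wording and whitespace as approximate:

```jinja
{%- elif 'tool_calls' in message %}
    {%- if not message.tool_calls|length == 1 %}
        {{- raise_exception("This model only supports single tool-calls at once!") }}
    {%- endif %}
```

Note that `'tool_calls' in message` is true even when the list is empty, so the branch is entered, `message.tool_calls|length == 1` is false, and the exception fires despite no tool call being present.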
We should either patch the chat templates, or strip `tool_calls` from the response messages when the list is empty.
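A hedged sketch of the second option, dropping empty `tool_calls` before template application; the helper name and call site are hypothetical, not vLLM's actual API:

```python
def strip_empty_tool_calls(messages: list[dict]) -> list[dict]:
    """Drop 'tool_calls' keys whose value is an empty list, leaving
    messages with real tool calls untouched."""
    cleaned = []
    for message in messages:
        if message.get("tool_calls") == []:
            message = {k: v for k, v in message.items() if k != "tool_calls"}
        cleaned.append(message)
    return cleaned


# Applied to the failing conversation above, the assistant message loses its
# empty tool_calls field and the template renders normally.
messages = strip_empty_tool_calls(messages)
```

This could live either client-side or in vLLM's chat preprocessing, before the tokenizer's `apply_chat_template` is invoked.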
Error log: