MeetKai Functionary GGUF models - chat template format unknown #1641
-
anybody have an idea about this? thanks
-
Sources
{% set loop_messages = messages %}
{% for message in loop_messages %}
{% set content = '<|start_header_id|>' + message['role'] + '<|end_header_id|>\n\n' + message['content'] | trim + '<|eot_id|>' %}
{% if loop.index0 == 0 %}
{% set content = bos_token + content %}
{% endif %}
{{ content }}
{% endfor %}
{% if add_generation_prompt %}
{{ '<|start_header_id|>assistant<|end_header_id|>\n\n' }}
{% endif %}

What I got so far:

roles:
user: "user"
assistant: "assistant"
system: "system"
tool: "tool"
template:
chat_message: |
<|begin_of_text|>{{.RoleName}}
{{if .Content}}{{.Content}}{{end}}<|end_of_text|>
chat: |
<|begin_of_text|>{{.RoleName}}
{{if .Content}}{{.Content}}{{end}}assistant
<|end_of_text|>
completion: |
{{.Input}} IMHO, it may be better to just run them and use the code ( |
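As a cross-check, here is a minimal sketch (assuming the `transformers` package and access to the Hugging Face Hub; the example messages are made up) that renders the model's own Jinja chat template, so the output can be compared against what the Go template above produces:

```python
# Minimal sketch: render the tokenizer's built-in Jinja chat template so the
# resulting prompt string can be compared against the LocalAI template above.
# Assumes the `transformers` package and Hub access; the messages are made up.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("meetkai/functionary-medium-v2.2")

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Turn on the kitchen lights."},
]

# tokenize=False returns the raw prompt string instead of token IDs;
# add_generation_prompt=True appends the assistant header the model expects.
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)
print(prompt)
```

If the LocalAI template reproduces this string exactly, special tokens included, the conversion should behave the same as the reference implementation.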
-
Thanks @Zen3515. I used the older MeetKai Functionary with that integration, and it did handle the functions well. At the moment I'm using Llama3 8b (q8) with the HomeLLM integration and it seems to be working well, mostly. Have you tried this medium v2.2 model with HA?
-
LocalAI version:
quay.io/go-skynet/local-ai:master-cublas-cuda12-ffmpeg
Using the Jan 21, 2024 build

Environment, CPU architecture, OS, and Version:
Docker on Ubuntu, AMD CPU, dual 4060 GPUs

I cannot run the GGUF model using the standard chat templates. When I use the standard chat and chat-block templates as shown here, a query produces this error.

The devs over at MeetKai say here that this model uses a Jinja template in tokenizer_config.json:
https://huggingface.co/meetkai/functionary-medium-v2.2/blob/main/tokenizer_config.json#L69
and that it would have to be converted to LocalAI's format.

Any suggestions on how to do this conversion? Thanks
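For reference, a minimal sketch (assuming a locally downloaded copy of the file; the path is an assumption) that extracts the raw Jinja template referenced above, so it can be translated by hand into LocalAI's template format:

```python
# Minimal sketch: pull the raw Jinja chat template out of a downloaded
# tokenizer_config.json so it can be translated by hand into LocalAI's
# Go-template (chat_message/chat/completion) format.
# The local file path is an assumption.
import json

with open("tokenizer_config.json") as f:
    config = json.load(f)

# "chat_template" is the field at the line linked above (#L69).
print(config["chat_template"])
```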