gemma-4-31b-bf16 not working with oMLX 0.3.2 #555
Description
Describe the bug
After loading the model gemma-4-31b-bf16 (mlx),

I got an error response after sending a message on the chat page.
Error: {"error":{"message":"Chat template error: Cannot use chat template functions because tokenizer.chat_template is not set and no template argument was passed! For information about writing templates and setting the tokenizer.chat_template attribute, please see the documentation at https://huggingface.co/docs/transformers/main/en/chat_templating","type":"invalid_request_error","param":null,"code":null}}
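The error indicates that the model's `tokenizer_config.json` does not define a `chat_template`, so the server cannot format chat messages for the model. As a possible workaround until this is fixed, one could patch the downloaded config to add a template manually. The sketch below is an assumption, not a confirmed fix: `GEMMA_TEMPLATE` is a hypothetical Gemma-style template written for illustration, and the official template should be taken from the model card instead.

```python
import json

# Hypothetical Gemma-style chat template (assumption -- consult the
# model card on Hugging Face for the model's official template).
GEMMA_TEMPLATE = (
    "{% for message in messages %}"
    "<start_of_turn>{{ message['role'] }}\n"
    "{{ message['content'] }}<end_of_turn>\n"
    "{% endfor %}"
    "<start_of_turn>model\n"
)


def has_chat_template(tokenizer_config: dict) -> bool:
    """Return True if the tokenizer config already defines a chat template."""
    return bool(tokenizer_config.get("chat_template"))


def ensure_chat_template(config_path: str, template: str = GEMMA_TEMPLATE) -> bool:
    """Add a chat_template to tokenizer_config.json if it is missing.

    Returns True if the file was modified, False if a template was
    already present.
    """
    with open(config_path) as f:
        config = json.load(f)
    if has_chat_template(config):
        return False
    config["chat_template"] = template
    with open(config_path, "w") as f:
        json.dump(config, f, indent=2)
    return True
```

Running `ensure_chat_template` against the `tokenizer_config.json` in the downloaded model directory, then reloading the model, might avoid the "tokenizer.chat_template is not set" error; whether the server re-reads the config on load is an assumption.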
To Reproduce
Steps to reproduce the behavior:
- Download the mlx-community/gemma-4-31b-bf16 model from HF
- Load the model
- Open the chat page and enter a message.
- See error
Expected behavior
A message response is returned without an error.
Desktop (please complete the following information):
- OS: macOS
- Browser: Chrome(146.0.7680.165)
- Version: 26.4