
gemma-4-31b-bf16 not working with oMLX 0.3.2 #555

@liuxp2003

Description


Describe the bug
After loading the model gemma-4-31b-bf16 (mlx), I got an error response after sending a message in the chat page.
Error: {"error":{"message":"Chat template error: Cannot use chat template functions because tokenizer.chat_template is not set and no template argument was passed! For information about writing templates and setting the tokenizer.chat_template attribute, please see the documentation at https://huggingface.co/docs/transformers/main/en/chat_templating","type":"invalid_request_error","param":null,"code":null}}
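The error means the tokenizer files shipped with the converted MLX weights do not set `tokenizer.chat_template`, so the server has no rule for turning the chat message list into a prompt string. As a rough illustration of what that missing step does, here is a minimal stdlib-only sketch of a fallback formatter; the "role: content" layout is an assumption for illustration only, not the template Gemma actually uses.

```python
def apply_fallback_chat_template(messages):
    """Render a chat message list into a single prompt string.

    Stand-in for tokenizer.apply_chat_template when
    tokenizer.chat_template is unset. The "role: content" layout
    below is illustrative, not Gemma's real template.
    """
    return "".join(f"{m['role']}: {m['content']}\n" for m in messages)

prompt = apply_fallback_chat_template(
    [{"role": "user", "content": "Hello"}]
)
# prompt == "user: Hello\n"
```

A proper fix would instead restore the `chat_template` entry in the model's `tokenizer_config.json` (or pass a template explicitly), so the server formats prompts the way the model was trained to expect.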

To Reproduce
Steps to reproduce the behavior:

  1. Download the mlx-community/gemma-4-31b-bf16 model from HF
  2. Load the model
  3. Open the chat page and enter a message
  4. See the error

Expected behavior
I can get a message response without an error.

Screenshots

[Screenshot: error response shown in the chat page]

Desktop (please complete the following information):

  • OS: macOS
  • Browser: Chrome (146.0.7680.165)
  • Version: 26.4


Metadata


Labels: in progress (currently being worked on)
