Requirements for supporting Qwen3.5
vLLM requirements (a version-check sketch follows this list):
- transformers==5.3.0 release (needs to include this PR: Fix TypeError in convert_rope_params_to_dict when ignore_keys is a list, huggingface/transformers#44272)
- vllm==0.16.1 release, which adds official Qwen3.5 support in vLLM (qwen3_5.py)
- a pre-built Linux wheel for flash-attn==2.8.3 + torch 2.10.0 (https://github.com/Dao-AILab/flash-attention/releases/tag/v2.8.3 does not include a torch 2.10 wheel, and vllm==0.16.1 requires torch==2.10.0)
- merge the changes to apply_chat_template in this PR: Bump transformers to >=5.0.0 for GLM-4.7-Flash #1241
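
Not from the issue itself: a minimal sketch for sanity-checking these pins locally with importlib.metadata. The expected versions are copied from the list above; adjust as the releases land.

```python
# Hypothetical pre-flight check: verify that the pinned versions from the
# requirements list above are actually installed.
from importlib.metadata import PackageNotFoundError, version

# Pins copied from the list above (assumed final release versions).
EXPECTED = {
    "transformers": "5.3.0",
    "vllm": "0.16.1",
    "torch": "2.10.0",
    "flash-attn": "2.8.3",
}

for pkg, want in EXPECTED.items():
    try:
        # Drop any local version suffix such as "+cu128" on torch wheels.
        have = version(pkg).split("+")[0]
    except PackageNotFoundError:
        print(f"{pkg}: NOT INSTALLED (want {want})")
        continue
    status = "ok" if have == want else f"MISMATCH (want {want})"
    print(f"{pkg}: {have} {status}")
```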
Megatron requirements:
- bump megatron-bridge to latest main (commit 0034ddaad7fae7c658c3df7e12d13522a4935770); a pin sketch follows below
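
For illustration only, one way to express that pin in a requirements file; the repository URL (NVIDIA/Megatron-Bridge) is an assumption, not stated in the issue:

```
# hypothetical requirements.txt entry; repo URL assumed
megatron-bridge @ git+https://github.com/NVIDIA/Megatron-Bridge.git@0034ddaad7fae7c658c3df7e12d13522a4935770
```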