
[CI] Migrate MoE Tests to OlMoE #1384

Draft
devpatelio wants to merge 3 commits into main from devpatel/gpuci_change_moe

Conversation


@devpatelio devpatelio commented Mar 25, 2026

Tentatively migrated all CI tests to OlMoE except five in test_megatron_worker.py:

  • test_megatron_forward[tp4_pp1_cp1_ep4_etp1_policy_seq_packing]
  • test_megatron_lora_forward[tp4_pp1_cp1_ep4_etp1_policy]
  • test_megatron_train[tp4_pp1_cp1_ep4_etp1_policy_seq_packing]
  • test_megatron_train[tp4_pp1_cp1_ep4_etp1_policy_seq_packing_lora]
  • test_megatron_offload_memory_and_correctness[policy]

The Megatron Bridge does not support TP > 1 combined with EP > 1 when q/k layernorm is enabled, so this will require a patch. For now, OlMoE should reduce CI resources for the majority of the tests.

See https://github.com/NVIDIA-NeMo/Megatron-Bridge/blob/ffe05be084aaee1a2bccd005737329de9284ce91/src/megatron/bridge/models/olmoe/olmoe_provider.py#L137 for reference
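The affected configs above are exactly those with both tensor parallelism and expert parallelism greater than one. A minimal sketch of how the unsupported combinations could be gated in the test suite (the function name here is illustrative, not from the NeMo-RL repo):

```python
# Hypothetical helper for skipping parallel configs that hit the
# Megatron Bridge limitation: TP > 1 together with EP > 1 is rejected
# when q/k layernorm is enabled (see olmoe_provider.py linked above).

def requires_qk_layernorm_patch(tp: int, ep: int) -> bool:
    """Return True for parallel configs the bridge cannot build yet."""
    return tp > 1 and ep > 1

# In a test module this could back a pytest skip, e.g.:
#   if requires_qk_layernorm_patch(tp=4, ep=4):
#       pytest.skip("q/k layernorm unsupported with TP>1 and EP>1")
```

This keeps the skip condition in one place, so the five tests above can be re-enabled together once the upstream patch lands.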

@erictang000

> The Megatron Bridge does not support TP > 1 combined with EP > 1 when q/k layernorm is enabled, so this will require a patch. For now, OlMoE should reduce CI resources for the majority of the tests.

oh hmm ok, I didn't realize this... TP + EP seems like a prerequisite for any of these tests... let's figure out another workaround for the router replay tests, and we can keep the megatron_worker tests just using fewer hidden layers

@erictang000

We could consider fixing the issue upstream in Megatron-Bridge, or implementing our own bridge for a small model like the small Arcee MoE models.
