Fix failed tests on upstream-2026-10-02 #58

Open

xrsrke wants to merge 1 commit into upstream-2026-10-02 from phuc/fix_failed_tests_on_upstream-2026-10-02

Conversation

xrsrke commented Mar 6, 2026

No description provided.

- Fix the GptOssGroupedExperts.init_weights() signature: add the missing
  n_layers param so it matches the MoE.init_weights() call at moe/moe.py:1064
  (sketch 1 below)
- Fix the HF checkpoint integration test: derive checkpoint_path from
  output_dir instead of the hardcoded artifacts-to-be-uploaded/ path
  (sketch 2 below)
- Fix test_checkpoint.py: add a process_group kwarg to the fake_save methods
  (sketch 3 below)
- Skip the DeepSeek-V3 transformers tokenizer comparison: tokenizer.json (BPE)
  and tokenizer.model (SentencePiece) use different merge implementations, so
  token-for-token comparison is not meaningful (sketch 4 below)
- Add a debugmodel_moe_deepep flavor for convergence testing (sketch 5 below)
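
Sketch 1, the init_weights signature fix. This is a minimal illustration, assuming the module holds grouped expert weight tensors; the parameter names and the way n_layers feeds the init are assumptions, not the repo's actual code.

```python
import torch
import torch.nn as nn


class GptOssGroupedExperts(nn.Module):
    """Grouped expert weights for a GPT-OSS style MoE layer (illustrative)."""

    def __init__(self, num_experts: int, dim: int, hidden_dim: int):
        super().__init__()
        self.w1 = nn.Parameter(torch.empty(num_experts, dim, hidden_dim))
        self.w2 = nn.Parameter(torch.empty(num_experts, hidden_dim, dim))

    # Before the fix the method took only init_std, so the call site
    # MoE.init_weights(...) at moe/moe.py:1064, which also passes n_layers,
    # raised a TypeError. Accepting n_layers restores the match; using it to
    # depth-scale the output projection's std is one plausible use.
    def init_weights(self, init_std: float, n_layers: int) -> None:
        nn.init.trunc_normal_(self.w1, mean=0.0, std=init_std)
        nn.init.trunc_normal_(self.w2, mean=0.0, std=init_std / (2 * n_layers) ** 0.5)
```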
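
Sketch 2, the dynamic checkpoint path. A small helper under assumed names: the real test may build the path inline, and the checkpoint/step-N layout is an assumption.

```python
import os


def checkpoint_path_for(output_dir: str, step: int) -> str:
    """Build the checkpoint path under the test's own output_dir instead of
    the previously hardcoded artifacts-to-be-uploaded/ prefix."""
    return os.path.join(output_dir, "checkpoint", f"step-{step}")


# Inside the integration test (names hypothetical):
# checkpoint_path = checkpoint_path_for(output_dir, step=10)
```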
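
Sketch 3, the fake_save fix. The class and attribute names are hypothetical; the point is that the checkpointer under test now forwards a process_group keyword, so the fake must accept it or the test fails with "got an unexpected keyword argument 'process_group'".

```python
class DummySaver:
    """Test double that records save calls instead of writing files."""

    def __init__(self):
        self.calls = []

    # process_group defaults to None so call sites that don't pass it keep
    # working; the fake ignores it beyond recording the call.
    def fake_save(self, state_dict: dict, checkpoint_id: str, process_group=None):
        self.calls.append((state_dict, checkpoint_id, process_group))
```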
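
Sketch 4, the tokenizer skip as an ordinary pytest skip. The test name is hypothetical; the reason string carries the rationale from the commit message.

```python
import pytest


@pytest.mark.skip(
    reason="tokenizer.json (BPE) and tokenizer.model (SentencePiece) use "
    "different merge implementations, so exact token comparison is invalid"
)
def test_deepseek_v3_matches_transformers_tokenizer():
    ...
```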
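
Sketch 5, the new flavor. A hedged sketch assuming a torchtitan-style flavor registry keyed by name; the dataclass fields and the use_deepep flag are illustrative, not the repo's actual schema.

```python
from dataclasses import dataclass


@dataclass
class MoEModelArgs:
    dim: int = 256
    n_layers: int = 4
    n_heads: int = 8
    num_experts: int = 8
    top_k: int = 2
    use_deepep: bool = False  # dispatch tokens through DeepEP kernels


flavors = {
    "debugmodel_moe": MoEModelArgs(),
    # Same tiny model, but exercising the DeepEP all-to-all path so the
    # convergence test covers it.
    "debugmodel_moe_deepep": MoEModelArgs(use_deepep=True),
}
```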
xrsrke marked this pull request as ready for review March 6, 2026 22:46
xrsrke requested a review from jquesnelle March 6, 2026 22:46