Fix TransformerConfig validation for mixed dense/MoE upcycling #3647

Open

rkteddy wants to merge 2 commits into NVIDIA:main from
Conversation
asolergi-nv reviewed on Mar 2, 2026
```diff
-assert (
-    self.moe_ffn_hidden_size is None
-), "moe_ffn_hidden_size must be None when num_experts is not set."
+if self.num_moe_experts is None and self.moe_ffn_hidden_size is not None:
```
Contributor

Can we make this check more robust, at least by checking those flags and explicitly warning about the scenario in which we expect to see the warning? Thanks!
Author

Thanks for the suggestion! Updated the check to also inspect moe_layer_freq:

- If moe_layer_freq indicates a mixed dense/MoE model, warn and reset moe_ffn_hidden_size; this is the expected scenario where dense layers inherit the global MoE config.
- Otherwise, raise a ValueError, since setting moe_ffn_hidden_size without num_moe_experts is a misconfiguration.
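In sketch form, the new logic is roughly the following (a standalone mock with only the relevant fields; the mixed-pattern test and the exact warning/error messages are illustrative, not the literal patch):

```python
import warnings
from dataclasses import dataclass
from typing import List, Optional, Union


@dataclass
class MoEConfigSketch:
    """Standalone mock of the relevant TransformerConfig fields (illustrative only)."""

    num_moe_experts: Optional[int] = None
    moe_ffn_hidden_size: Optional[int] = None
    moe_layer_freq: Union[int, List[int]] = 1

    def __post_init__(self):
        if self.num_moe_experts is None and self.moe_ffn_hidden_size is not None:
            # A "mixed" moe_layer_freq contains dense layers: an int > 1, or a 0/1
            # pattern list with at least one 0.
            is_mixed = (
                isinstance(self.moe_layer_freq, int) and self.moe_layer_freq > 1
            ) or (isinstance(self.moe_layer_freq, list) and 0 in self.moe_layer_freq)
            if is_mixed:
                # Expected during mixed dense/MoE upcycling: a dense layer inherited
                # the global MoE config. Warn and drop the MoE-only field.
                warnings.warn(
                    "moe_ffn_hidden_size is set but num_moe_experts is None; "
                    "assuming a dense layer of a mixed dense/MoE model and "
                    "resetting moe_ffn_hidden_size to None."
                )
                self.moe_ffn_hidden_size = None
            else:
                # Genuine misconfiguration: an MoE FFN size without any experts.
                raise ValueError(
                    "moe_ffn_hidden_size requires num_moe_experts to be set."
                )
```

With this mock, `MoEConfigSketch(moe_ffn_hidden_size=2048, moe_layer_freq=[0, 1])` warns and resets the field, while `MoEConfigSketch(moe_ffn_hidden_size=2048, moe_layer_freq=1)` raises a ValueError.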
…e ValueError for invalid config

65aac00 to 70ad756
What does this PR do?

Fix `TransformerConfig` validation for mixed dense/MoE upcycling (`--moe-layer-freq` + `--moe-use-upcycling`): in a mixed model, dense layer configs inherit the global `moe_ffn_hidden_size` while having `num_moe_experts=None`. The existing assert in `TransformerConfig.__post_init__` raises an error in this case. Replace the assert with a warning and set `moe_ffn_hidden_size = None` for dense layers.

Related: #3646 fixes the upcycling state dict conversion for the same scenario.
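For illustration, a minimal sketch of how the failing scenario arises (the field values are illustrative, the use of `dataclasses.replace` to derive the per-layer dense config is an assumption about how upcycling builds it, and the exact set of required fields may differ by Megatron-Core version):

```python
from dataclasses import replace

from megatron.core.transformer import TransformerConfig

# Global config for a mixed dense/MoE model (values illustrative).
global_cfg = TransformerConfig(
    num_layers=4,
    hidden_size=1024,
    num_attention_heads=16,
    num_moe_experts=8,
    moe_ffn_hidden_size=2048,
    moe_layer_freq=[0, 1, 0, 1],  # alternating dense / MoE layers
)

# Config seen by a dense layer: num_moe_experts is cleared, but
# moe_ffn_hidden_size is still inherited from the global config.
# Before this PR, __post_init__ asserted and crashed here; with this PR
# it warns and resets moe_ffn_hidden_size to None.
dense_layer_cfg = replace(global_cfg, num_moe_experts=None)
```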
Contribution process
```mermaid
flowchart LR
    A[Pre-checks] --> B[PR Tests]
    subgraph Code Review/Approval
        C1[Expert Review] --> C2[Final Review]
    end
    B --> C1
    C2 --> D[Merge]
```

Pre-checks

Code review
The following process is enforced via the CODEOWNERS file for changes into `megatron/core`. For changes outside of `megatron/core`, it is up to the PR author whether or not to tag the Final Reviewer team.

For MRs into `main` branch
Feel free to message or comment @mcore-oncall to help accelerate your merge into main. The less complex your PR is, the faster it will be approved and merged!
(Step 1): Add PR label `Expert Review`

(Step 2): Collect the expert reviewers' reviews
- Add the `Expert Review` label when your PR is ready for review.
- Final Review might get declined if these requirements are not fulfilled.

(Step 3): Final Review
- Add the `Final Review` label.

(Optional Step 4): Cherry-pick into release branch
If this PR also needs to be merged into `core_r*` release branches, after this PR has been merged, select Cherry-pick to open a new PR into the release branch.

For MRs into `dev` branch
The proposed review process for the `dev` branch is under active discussion. MRs are mergeable after one approval by either eharper@nvidia.com or zijiey@nvidia.com.

Merging your PR
Any member of core-adlr and core-nemo will be able to merge your PR.