feat: upgrade MiniMax default model to M2.7 (#347)
Conversation
- Add MiniMax-M2.7 and MiniMax-M2.7-highspeed to the model list
- Set MiniMax-M2.7 as the default model in the install scripts
- Keep all previous models (M2.5) as alternatives
- Update context window mappings, known models, and docs
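The change list above can be sketched roughly as follows. This is an illustrative snippet only: the variable names, structure, and token counts are assumptions for this example, not the repository's actual identifiers or values.

```python
# Hypothetical sketch of the model-list change; names and numbers are
# placeholders, not the repo's real config.
KNOWN_MODELS = [
    "MiniMax-M2.5",
    "MiniMax-M2.5-highspeed",
    "MiniMax-M2.7",            # new flagship model
    "MiniMax-M2.7-highspeed",  # new high-throughput variant
]

# Context window mapping (token counts here are placeholder values).
CONTEXT_WINDOWS = {model: 128_000 for model in KNOWN_MODELS}

# M2.7 becomes the default; the M2.5 entries stay available as alternatives.
DEFAULT_MODEL = "MiniMax-M2.7"

assert DEFAULT_MODEL in KNOWN_MODELS
```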
PR Bot seems not to be a GitHub user. You need a GitHub account to be able to sign the CLA. If you already have a GitHub account, please add the email address used for this commit to your account. You have already signed the CLA but the status is still pending? Let us recheck it.
📊 CI Metrics Report
Generated by HiClaw CI on 2026-03-18 10:35:07 UTC
johnlanni left a comment
The Bailian coding plan doesn't have M2.7 yet; you can add it to the known models list first.
…ls only
Bailian coding plan does not support M2.7 yet, so revert the coding plan model selector back to M2.5 while keeping M2.7 in the known models list for manual model switching.
Thanks for the review @johnlanni! Updated: reverted the coding plan model selector back to MiniMax-M2.5 (since Bailian coding plan does not support M2.7 yet), while keeping MiniMax-M2.7 and MiniMax-M2.7-highspeed in the known models list so they can be used via manual model switching. Please take another look.
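The revert described above amounts to a selector fallback: the coding plan defaults to M2.5, while M2.7 stays known so a manual switch still works. A minimal sketch, assuming hypothetical names (`KNOWN_MODELS`, `select_model` are illustrative, not the repo's actual API):

```python
# Illustrative sketch of the revert, not the repo's actual code.
KNOWN_MODELS = {"MiniMax-M2.5", "MiniMax-M2.7", "MiniMax-M2.7-highspeed"}

# Bailian coding plan does not support M2.7 yet, so it stays on M2.5.
CODING_PLAN_DEFAULT = "MiniMax-M2.5"

def select_model(requested=None):
    """Return the requested model if it is known, else the coding-plan default."""
    if requested in KNOWN_MODELS:
        return requested  # manual model switching still reaches M2.7
    return CODING_PLAN_DEFAULT

print(select_model("MiniMax-M2.7"))  # manual switch to the new model
print(select_model(None))            # coding plan falls back to MiniMax-M2.5
```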
Why
MiniMax-M2.7 is the latest flagship model with enhanced reasoning and coding capabilities.