
feat: upgrade MiniMax default model to M2.7 #347

Merged
johnlanni merged 2 commits into alibaba:main from octo-patch:feature/upgrade-minimax-m27
Mar 21, 2026

Conversation

@octo-patch
Contributor

Summary

  • Add MiniMax-M2.7 and MiniMax-M2.7-highspeed to the model selection list
  • Set MiniMax-M2.7 as the new default model when selecting MiniMax
  • Retain MiniMax-M2.5 as an available alternative

Changes

  • Model configs: Added M2.7 and M2.7-highspeed entries to known-models.json, manager-openclaw.json.tmpl, and worker-openclaw.json.tmpl (both models list and aliases)
  • Shell scripts: Updated context window case statements in 4 scripts to recognize M2.7 and M2.7-highspeed
  • Install scripts: Updated both bash and PowerShell installers — model display text, KNOWN_MODELS list, and default selection
  • Docs: Updated model references in windows-deploy.md (EN + ZH) and both SKILL.md files
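The context-window case statements mentioned in the changes above might look like the following minimal sketch. The model names match the PR; the function name, variable names, and token values are illustrative assumptions, not taken from the actual diff:

```shell
#!/bin/sh
# Hypothetical sketch of a context-window lookup updated for M2.7.
# The 200000/128000 values and the function name are assumptions,
# not copied from the PR; only the model names come from the PR.
get_context_window() {
  case "$1" in
    MiniMax-M2.7|MiniMax-M2.7-highspeed|MiniMax-M2.5)
      echo 200000 ;;
    *)
      echo 128000 ;;
  esac
}

get_context_window "MiniMax-M2.7-highspeed"
```

Adding the new names to the same pattern branch as M2.5 keeps the change data-only, consistent with the "no logic changes" note below.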

Why

MiniMax-M2.7 is the latest flagship model with enhanced reasoning and coding capabilities.

Testing

  • All JSON configs validated
  • No existing MiniMax unit tests to update (system uses integration tests)
  • Changes are config/data-only — no logic changes

- Add MiniMax-M2.7 and MiniMax-M2.7-highspeed to model list
- Set MiniMax-M2.7 as default model in install scripts
- Keep all previous models (M2.5) as alternatives
- Update context window mappings, known models, and docs
@CLAassistant

CLAassistant commented Mar 18, 2026

CLA assistant check
Thank you for your submission! We really appreciate it. Like many open source projects, we ask that you all sign our Contributor License Agreement before we can accept your contribution.
1 out of 2 committers have signed the CLA.

✅ octo-patch
❌ PR Bot


PR Bot does not appear to be a GitHub user. You need a GitHub account to be able to sign the CLA. If you already have a GitHub account, please add the email address used for this commit to your account.
You have signed the CLA already but the status is still pending? Let us recheck it.

@github-actions
Contributor

github-actions bot commented Mar 18, 2026

📊 CI Metrics Report

ℹ️ No baseline available - This is the first run or baseline data was not found.

Summary

| Metric | Value |
| --- | --- |
| LLM Calls | 233 |
| Input Tokens | 6100787 |
| Output Tokens | 49208 |

By Role

| Role | LLM Calls | Input Tokens | Output Tokens |
| --- | --- | --- | --- |
| 🧠 Manager | 162 | 5109646 | 35272 |
| 🔧 Workers | 71 | 991141 | 13936 |

Per-Test Breakdown

| Test | Manager Calls | Worker Calls | Manager Input Tokens | Worker Input Tokens | Manager Output Tokens | Worker Output Tokens |
| --- | --- | --- | --- | --- | --- | --- |
| 02-create-worker | 33 | 0 | 735685 | 0 | 5957 | 0 |
| 03-assign-task | 15 | 13 | 380814 | 197521 | 2455 | 2462 |
| 04-human-intervene | 20 | 9 | 506574 | 142060 | 3201 | 1220 |
| 05-heartbeat | 8 | 1 | 270502 | 0 | 1782 | 0 |
| 06-multi-worker | 36 | 23 | 1349222 | 284204 | 9249 | 3750 |
| 14-git-collab | 50 | 25 | 1866849 | 367356 | 12628 | 6504 |

Generated by HiClaw CI on 2026-03-18 10:35:07 UTC

@johnlanni
Collaborator
The Bailian coding plan does not have M2.7 yet; for now you can just add it to the known models list.

…ls only

Bailian coding plan does not support M2.7 yet, so revert the coding
plan model selector back to M2.5 while keeping M2.7 in the known
models list for manual model switching.
@octo-patch
Contributor Author

Thanks for the review @johnlanni! Updated: reverted the coding plan model selector back to MiniMax-M2.5 (since Bailian coding plan does not support M2.7 yet), while keeping MiniMax-M2.7 and MiniMax-M2.7-highspeed in the known models list so they can be used via manual model switching. Please take another look.
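The end state described here can be sketched roughly for the bash installer. `KNOWN_MODELS` is named in the PR summary; `DEFAULT_MODEL` and the array syntax are assumptions for illustration, not the installer's actual code:

```shell
#!/bin/bash
# Rough sketch of the installer's final state after the revert:
# M2.7 and M2.7-highspeed stay in KNOWN_MODELS for manual switching,
# while the default selection goes back to M2.5 because the Bailian
# coding plan does not support M2.7 yet. DEFAULT_MODEL is an assumed name.
KNOWN_MODELS=(
  "MiniMax-M2.5"
  "MiniMax-M2.7"
  "MiniMax-M2.7-highspeed"
)
DEFAULT_MODEL="MiniMax-M2.5"

printf '%s\n' "${KNOWN_MODELS[@]}"
echo "default: $DEFAULT_MODEL"
```

This separation between "known" and "default" is what lets users switch to M2.7 manually without changing what new installs select out of the box.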

@johnlanni
Collaborator

LGTM

@johnlanni johnlanni merged commit f058051 into alibaba:main Mar 21, 2026
2 of 3 checks passed