feat: Enhanced sampling with modelPreferences (MCP 2025-11-25)#86
Conversation
💡 Codex Review
Here are some automated review suggestions for this pull request.
Reviewed commit: 9a4ad74006
@sampling_handler.call(messages, model_preferences, system_prompt, max_tokens,
  { 'includeContext' => include_context, 'temperature' => temperature,
Preserve optional-arity handlers when dispatching sampling args
The new arity dispatch treats every handler whose arity falls outside 0..4 as "5+" and always passes an extra hash. However, Ruby lambdas with optional arguments report negative arity (for example, ->(messages, prefs, prompt, max_tokens = nil) { ... } has arity -4), so those handlers now fall into the else branch and receive 5 arguments, raising ArgumentError (expected 3..4) and causing sampling requests to fail at runtime. Before this change they received 4 arguments and worked.
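A minimal sketch of arity-aware dispatch that also handles negative arity, assuming the handler receives the four positional sampling arguments plus an optional options hash; the method name and argument names here are illustrative, not the gem's actual implementation:

```ruby
# Illustrative sketch: dispatch sampling arguments based on the handler's
# arity, handling negative arity (optional parameters) instead of always
# passing five arguments to any non-0..4 handler.
def dispatch_sampling(handler, messages, model_preferences, system_prompt, max_tokens, extra)
  args = [messages, model_preferences, system_prompt, max_tokens, extra]
  arity = handler.arity

  if arity >= 0
    # Fixed arity: pass exactly the number of arguments the handler declares.
    handler.call(*args.first(arity))
  elsif handler.parameters.any? { |type, _| type == :rest }
    # A splat parameter accepts everything, so pass the full argument list.
    handler.call(*args)
  else
    # Negative arity means optional parameters: count how many positional
    # parameters (required + optional) the handler can actually accept, so a
    # handler like ->(messages, prefs, prompt, max_tokens = nil) { ... }
    # (arity -4) receives 4 arguments rather than 5.
    accepted = handler.parameters.count { |type, _| type == :req || type == :opt }
    handler.call(*args.first([accepted, args.size].min))
  end
end
```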
Adds modelPreferences (hints, costPriority, speedPriority, intelligencePriority) and systemPrompt handling.
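For context, a sketch of how a sampling/createMessage request carrying these fields might look, expressed as a Ruby hash; the shape follows the field names listed above, and the model hint and values are purely illustrative:

```ruby
# Illustrative sampling/createMessage params using the fields added in this PR
# (modelPreferences hints and priorities, systemPrompt). Exact wire format may
# differ from the gem's internals.
params = {
  'messages' => [
    { 'role' => 'user',
      'content' => { 'type' => 'text', 'text' => 'Summarize the design doc.' } }
  ],
  'modelPreferences' => {
    'hints' => [{ 'name' => 'claude-3-5-sonnet' }], # ordered, advisory model hints
    'costPriority' => 0.3,                           # 0..1, higher = cheaper preferred
    'speedPriority' => 0.5,                          # 0..1, higher = faster preferred
    'intelligencePriority' => 0.9                    # 0..1, higher = more capable preferred
  },
  'systemPrompt' => 'You are a concise technical assistant.',
  'maxTokens' => 512
}
```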