11 changes: 8 additions & 3 deletions openhands-sdk/openhands/sdk/llm/options/chat_options.py
@@ -39,9 +39,14 @@ def select_chat_options(
if llm.reasoning_effort is not None:
out["reasoning_effort"] = llm.reasoning_effort

# All reasoning models ignore temp/top_p
out.pop("temperature", None)
out.pop("top_p", None)
# OpenAI reasoning models (o1, o3) ignore temp/top_p
# Gemini models DO respect temperature, so we only pop for OpenAI
model_lower = llm.model.lower()
# Normalize to basename so provider-prefixed IDs like "openai/o1" are handled
model_name = model_lower.split("/")[-1]
if model_name.startswith(("o1", "o3")):
Collaborator:
I think it's not only o1 and o3; there are also Claude models, open models, and GPT models.

I'd actually suggest that, since we removed temperature by default, we could remove these temperature special cases for all models.

The other models will also get nothing set (from the SDK), so they should all respect the user's choice.

out.pop("temperature", None)
out.pop("top_p", None)

# Gemini 2.5-pro default to low if not set
if "gemini-2.5-pro" in llm.model:
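For reference, the option filtering in this hunk can be sketched as a standalone function. This is a simplified sketch, not the SDK's actual implementation: the `opts` dict stands in for `out`, the `llm` object is replaced by a plain `model` string, and the Gemini branch (setting `reasoning_effort` to `"low"` when unset) is inferred from the comment in the diff, since its body is cut off in the hunk.

```python
def select_chat_options(model: str, opts: dict) -> dict:
    """Sketch of the diff's option filtering (simplified, assumptions noted above)."""
    # Normalize to the basename so provider-prefixed IDs like "openai/o1" match.
    model_name = model.lower().split("/")[-1]

    # OpenAI reasoning models (o1, o3) ignore temperature/top_p;
    # other providers (e.g. Gemini) do respect temperature, so only pop here.
    if model_name.startswith(("o1", "o3")):
        opts.pop("temperature", None)
        opts.pop("top_p", None)

    # Inferred from the diff comment: Gemini 2.5-pro defaults to "low"
    # reasoning effort when the caller did not set one.
    if "gemini-2.5-pro" in model.lower():
        opts.setdefault("reasoning_effort", "low")

    return opts
```

As the review comment notes, once temperature is no longer set by default, the model-specific `pop` branch could arguably be dropped entirely and the user's explicit choice passed through for every model.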