Fixes #211 with_structured_output for json_schema method #238
base: main
```diff
@@ -975,10 +975,9 @@ class AnswerWithJustification(BaseModel):
         response_format = {
             "type": "json_schema",
             "json_schema": {
-                "schema": (
-                    schema.model_json_schema() if is_pydantic_schema else schema  # type: ignore[union-attr]
-                ),
+                "name": kwargs.get("schema_name", "json_schema"),
+                "strict": True,
+                "schema": schema.model_json_schema() if is_pydantic_schema else schema,  # type: ignore[union-attr]
             },
```
Contributor

The PR is also missing unit test coverage. We have an existing test in which this line is exercised; can you update that test and include asserts for the name/strict/schema fields? Can you also add the following unit tests:
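A rough sketch of what such an assert block might look like. Note that `build_response_format` below is a hypothetical stand-in for the code under review (which builds this dict inline inside a method); only the shape of the asserts is the point:

```python
from pydantic import BaseModel


class AnswerWithJustification(BaseModel):
    answer: str
    justification: str


def build_response_format(schema, is_pydantic_schema=True, **kwargs):
    # Hypothetical stand-in mirroring the dict built in the diff above.
    return {
        "type": "json_schema",
        "json_schema": {
            "name": kwargs.get("schema_name", "json_schema"),
            "strict": True,
            "schema": schema.model_json_schema() if is_pydantic_schema else schema,
        },
    }


def test_json_schema_response_format():
    rf = build_response_format(AnswerWithJustification)
    # Assert on all three of the name/strict/schema fields.
    assert rf["type"] == "json_schema"
    assert rf["json_schema"]["name"] == "json_schema"
    assert rf["json_schema"]["strict"] is True
    assert rf["json_schema"]["schema"]["title"] == "AnswerWithJustification"
    assert set(rf["json_schema"]["schema"]["required"]) == {"answer", "justification"}


test_json_schema_response_format()
```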
```diff
             },
         }
         llm = self.bind(response_format=response_format)
```
The name should be derived, not hardcoded. Here you should use `convert_to_openai_function()` (add an additional import from the LangChain core function-calling module: `from langchain_core.utils.function_calling import convert_to_openai_function`). We should also do this to address the other cascading errors (see my overall review comment). You can refer to how the partner library does this as a reference; it calls `convert_to_openai_function`:

https://github.com/langchain-ai/langchain/blob/202d7f6c4a2ca8c7e5949d935bcf0ba9b0c23fb0/libs/partners/openai/langchain_openai/chat_models/base.py#L1449

A suggested fix may look something like this, but please be sure to test!