Add support for OpenAI Responses API to all OpenAI-related plugins #1447
@copilot looks solid, implement it
Co-authored-by: waldekmastykarz <11164679+waldekmastykarz@users.noreply.github.com>
Implementation complete! ✅ All five plugins now support the OpenAI Responses API alongside the existing APIs. Commit: 1f426b5

Key changes: the implementation follows the existing patterns and prioritizes the Responses API in detection, as recommended by OpenAI.
@copilot I updated the branch with the latest changes. Review your PR and ensure it fits with the latest changes.
Reviewed the PR after the branch update. The merge with main was successful and all changes are compatible.

No additional changes needed - the PR is ready for merge.
Implementation Complete: OpenAI Responses API Support
Successfully added support for the OpenAI Responses API (`/v1/responses`) to all five OpenAI-related plugins while maintaining full backward compatibility with the existing Chat Completions and Completions APIs.

Changes Made
1. New Model Classes (`DevProxy.Abstractions/LanguageModel/OpenAIModels.cs`)
- `OpenAIResponsesRequest` - handles requests with `input`, `modalities`, `instructions`, `store`, `previous_response_id`, `tools`, and `max_output_tokens`
- `OpenAIResponsesResponse` - parses responses with the `output` items array, `created_at`, and `status`
- `OpenAIResponsesOutputItem` - represents output items with `type`, `role`, and a `content` array
- `OpenAIResponsesContentPart` - represents content parts with `type` and `text`
- Extended `OpenAIRequest.TryGetOpenAIRequest()` to detect Responses API requests (checks for the `input` + `modalities` fields)

2. LanguageModelFailurePlugin ✅
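Given the fields listed above, a request that the new detection logic would classify as a Responses API call might look like the following. The values are illustrative and the `model` field is assumed from OpenAI's standard request shape; the remaining field names come straight from `OpenAIResponsesRequest`, and detection keys on the presence of `input` + `modalities`:

```json
{
  "model": "gpt-4o",
  "input": "Summarize the attached release notes in two sentences.",
  "modalities": ["text"],
  "instructions": "You are a concise technical assistant.",
  "store": false,
  "max_output_tokens": 256
}
```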
- Removed the local `TryGetOpenAIRequest` method in favor of the shared method

3. LanguageModelRateLimitingPlugin ✅
- Token counting for Responses API requests and responses (via the `Usage` property)
- Removed the local `TryGetOpenAIRequest` method

4. OpenAITelemetryPlugin ✅
- Added an `OpenAIResponsesRequest` case to request type detection
- `AddResponsesApiRequestTags()` - tags the operation as "responses"; includes prompt, instructions, and modalities
- `AddResponsesApiResponseTags()` - extracts usage, status, and completion text from output items
- Updated `GetOperationName()` to return "responses" for the Responses API

5. OpenAIUsageDebuggingPlugin ✅
- Works via the shared `Usage` object, which has the same structure across all APIs

6. OpenAIMockResponsePlugin ✅
- `ConvertResponsesInputToMessages()` - converts Responses API input to Chat Completions format for the local LLM
- `ConvertToResponsesApiResponse()` - transforms the local LLM response to Responses API format with proper output items
- `ExtractTextFromContent()` helper - extracts text from various content formats
- Removed the local `TryGetOpenAIRequest` method

Key Features
✅ Full Backward Compatibility - All existing Chat Completions and Completions functionality unchanged
✅ Unified Detection - A single `OpenAIRequest.TryGetOpenAIRequest()` method handles all request types
✅ Proper Priority - The Responses API is checked first (the recommended API per OpenAI docs)
✅ Usage Tracking - Token usage, costs, and rate limiting work across all APIs
✅ Telemetry Support - Full OpenTelemetry integration for Responses API
✅ Mock Support - Local LLM mocking with automatic format conversion
✅ No SDKs - Pure C# implementation following existing patterns
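The unified detection with Responses-first priority can be sketched as below. This is a minimal illustration, not the actual Dev Proxy code: the method name `DetectOpenAIRequestType` is hypothetical, and the assumption that Chat Completions and Completions are keyed on the standard `messages` and `prompt` fields comes from OpenAI's request shapes, not from this PR:

```csharp
using System.Text.Json;

static class OpenAIRequestDetection
{
    // Sketch of the detection order only; the real logic lives in
    // OpenAIRequest.TryGetOpenAIRequest() and returns typed request objects.
    public static string? DetectOpenAIRequestType(JsonElement body)
    {
        // Responses API first (the API OpenAI recommends),
        // keyed on the presence of both `input` and `modalities`
        if (body.TryGetProperty("input", out _) && body.TryGetProperty("modalities", out _))
            return "responses";

        // Chat Completions next, keyed on the `messages` array (assumed key)
        if (body.TryGetProperty("messages", out _))
            return "chat";

        // Completions last, keyed on the `prompt` field (assumed key)
        if (body.TryGetProperty("prompt", out _))
            return "completion";

        return null;
    }
}
```

Checking the Responses shape before the older APIs matters because a request body could plausibly carry overlapping fields; ordering the checks encodes the priority the summary describes.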
Testing
All five plugins now support the OpenAI Responses API alongside existing APIs! 🎉
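For reference, the Responses API response shape that the new model classes parse, and that `ConvertToResponsesApiResponse()` targets when mocking, is roughly the following. The `id` value, timestamp, and message text are illustrative, and the `"message"`/`"output_text"` type values follow OpenAI's documented conventions; the field names match `OpenAIResponsesResponse`, `OpenAIResponsesOutputItem`, and `OpenAIResponsesContentPart` above:

```json
{
  "id": "resp_abc123",
  "created_at": 1700000000,
  "status": "completed",
  "output": [
    {
      "type": "message",
      "role": "assistant",
      "content": [
        {
          "type": "output_text",
          "text": "Mocked completion from the local language model."
        }
      ]
    }
  ]
}
```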