Fix/GitHub copilot vision headers #16012
Open
krozbicki wants to merge 7 commits into BerriAI:main from krozbicki:fix/github-copilot-vision-headers
+152 −37
Conversation
@krozbicki is attempting to deploy a commit to the CLERKIEAI Team on Vercel. A member of the Team first needs to authorize it.
Force-pushed from 23badea to 1ea3228
…dation()
- Added requires_custom_headers_validation() method to BaseConfig (default False)
- Override in GithubCopilotConfig to return True
- Modified OpenAIChatCompletion.completion() to conditionally call validate_environment()
- Fixes missing 'Copilot-Vision-Request' header for vision requests
- Only affects providers that explicitly require custom headers validation
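The header behavior this commit describes can be sketched as a standalone function. This is a minimal, hypothetical sketch of the logic, not the actual LiteLLM implementation; the function and helper names here are illustrative, while the header names, role values, and the `type: "image_url"` check come from the PR text.

```python
from typing import Any, Dict, List


def _has_image_content(messages: List[Dict[str, Any]]) -> bool:
    """Return True if any message carries an image_url content part."""
    for message in messages:
        content = message.get("content")
        if isinstance(content, list):
            for part in content:
                if isinstance(part, dict) and part.get("type") == "image_url":
                    return True
    return False


def build_copilot_headers(messages: List[Dict[str, Any]]) -> Dict[str, str]:
    """Sketch of the GitHub Copilot-specific headers described in the PR."""
    headers: Dict[str, str] = {}
    # X-Initiator: "agent" if any message has role "tool" or "assistant",
    # "user" otherwise.
    if any(m.get("role") in ("assistant", "tool") for m in messages):
        headers["X-Initiator"] = "agent"
    else:
        headers["X-Initiator"] = "user"
    # Copilot-Vision-Request: true when image content is detected.
    if _has_image_content(messages):
        headers["Copilot-Vision-Request"] = "true"
    return headers
```

In the PR itself this logic lives inside `GithubCopilotConfig.validate_environment()`, layered on top of the base OpenAI headers from `super().validate_environment()`.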
- Test that GithubCopilotConfig.requires_custom_headers_validation() returns True
- Test that default BaseConfig implementation returns False
- All 17 tests in test_github_copilot_transformation.py passing
…sed approach
- Remove requires_custom_headers_validation() method from BaseConfig
- Remove requires_custom_headers_validation() override from GithubCopilotConfig
- Always call validate_environment() in the OpenAI completion flow
- Remove related tests for the removed flag method
Force-pushed from 1ea3228 to bf43af0
…letions
- Add get_complete_url() method to construct the full endpoint URL for base_llm_http_handler
- Add custom_llm_provider property returning LlmProviders.GITHUB_COPILOT
- Simplify __init__() to take no parameters (api_key, api_base, custom_llm_provider removed); these parameters were unused because the property always overrides the parameter value
- Aligns with patterns from 30+ other providers in the codebase
- Maintain validate_environment() with a super() call for proper authentication
- Add 5 new tests for get_complete_url() and the custom_llm_provider property
- Fix test_completion_github_copilot_mock_response to use the mock_response parameter
- All 50 tests passing; mypy type checking successful

This enables GitHub Copilot to work with base_llm_http_handler when the EXPERIMENTAL_OPENAI_BASE_LLM_HTTP_HANDLER flag is enabled, while maintaining backward compatibility with standard routing.
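The URL construction this commit adds (including the trailing-slash normalization tested later in the PR) can be sketched like this. A minimal sketch assuming a free-function form; the real method lives on `GithubCopilotConfig` and its exact signature may differ.

```python
from typing import Optional

# Default endpoint taken from the PR description.
DEFAULT_API_BASE = "https://api.githubcopilot.com"


def get_complete_url(api_base: Optional[str] = None) -> str:
    """Build the full chat-completions endpoint URL.

    Falls back to the default Copilot API base and strips any
    trailing slash so the path is joined cleanly.
    """
    base = (api_base or DEFAULT_API_BASE).rstrip("/")
    return f"{base}/chat/completions"
```

A custom `api_base` (e.g. from the authenticator) slots in the same way, which is what the `test_get_complete_url_with_custom_api_base` and `test_get_complete_url_strips_trailing_slash` tests exercise.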
…response
Add @patch decorators for Authenticator.get_api_key and get_api_base to prevent real OAuth authentication attempts during the mock response test. Fixes the GitHub Actions test failure: 'GetLLMProvider Exception - Failed to get access token after 3 attempts'
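The patching pattern this commit applies can be illustrated in isolation. The stand-in `Authenticator` below is hypothetical (the real one lives in LiteLLM's github_copilot module); the point is that patching `get_api_key` / `get_api_base` keeps a mocked test from triggering a real OAuth flow.

```python
from unittest.mock import patch


class Authenticator:
    """Stand-in for LiteLLM's GitHub Copilot authenticator."""

    def get_api_key(self) -> str:
        # In the real class this would run the OAuth device flow,
        # which is exactly what CI must not do.
        raise RuntimeError("would attempt real OAuth authentication")

    def get_api_base(self) -> str:
        raise RuntimeError("would hit the network")


def run_mocked_completion_setup() -> tuple:
    """Simulate test setup with the authenticator patched out."""
    with patch.object(Authenticator, "get_api_key", return_value="fake-token"), \
         patch.object(Authenticator, "get_api_base",
                      return_value="https://api.githubcopilot.com"):
        auth = Authenticator()
        # No OAuth attempt happens; the test gets deterministic values.
        return auth.get_api_key(), auth.get_api_base()
```

The real test uses `@patch` decorators on the actual import path, but the effect is the same: the mocked completion never reaches the token-refresh code that produced the 'Failed to get access token after 3 attempts' failure.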
Title
feat: Add base_llm_http_handler support for GitHub Copilot chat completions
Relevant issues
Fixes #14496
Pre-Submission checklist
Please complete all items before asking a LiteLLM maintainer to review your PR
- Added tests in the tests/litellm/ directory
- Unit tests pass (make test-unit)
Type
✨ Feature / 🐛 Bug Fix
Changes
Problem
The Copilot-Vision-Request header was not sent via base_llm_http_handler routing when the EXPERIMENTAL_OPENAI_BASE_LLM_HTTP_HANDLER flag is enabled.
Solution
1. Enhanced validate_environment() (fixes vision requests)
Extended the validate_environment() method in GithubCopilotConfig (litellm/llms/github_copilot/chat/transformation.py) to add GitHub Copilot-specific headers:
- Calls super().validate_environment() to get the base OpenAI headers (Authorization, Content-Type)
- Sets the X-Initiator header based on message roles: "agent" if any message has role "tool" or "assistant", "user" otherwise
- Sets the Copilot-Vision-Request: true header when image content is detected (image_url in message content, type: "image_url")
2. Add base_llm_http_handler support
Enabled GitHub Copilot to work with the new base_llm_http_handler routing mechanism:
- Added a get_complete_url() method to construct the full endpoint URL (https://api.githubcopilot.com/chat/completions)
- Added a custom_llm_provider property returning LlmProviders.GITHUB_COPILOT
- Simplified the __init__() method (removed the api_key, api_base, and custom_llm_provider parameters)
Fixed test to use mock_response parameter
test_completion_github_copilot_mock_response now uses the mock_response parameter instead of patching.
Impact
GitHub Copilot now works with base_llm_http_handler routing when the EXPERIMENTAL_OPENAI_BASE_LLM_HTTP_HANDLER flag is enabled.
Test Results
All 20 tests passing:
- test_custom_llm_provider_property - verifies the property returns the correct provider
- test_get_complete_url_with_default_api_base - default URL construction
- test_get_complete_url_with_custom_api_base - custom api_base handling
- test_get_complete_url_with_authenticator_api_base - dynamic api_base from the authenticator
- test_get_complete_url_strips_trailing_slash - trailing slash normalization
- test_completion_github_copilot_mock_response - now uses the mock_response parameter instead of patching
Type checking:
mypy passes with no issues.
Files Changed
- litellm/llms/github_copilot/chat/transformation.py - enhanced validate_environment(), added get_complete_url() and the custom_llm_provider property, simplified __init__()
- tests/test_litellm/llms/github_copilot/test_github_copilot_transformation.py - added 6 new tests, modified 1 existing test