
Conversation


@krozbicki krozbicki commented Oct 28, 2025

Title

feat: Add base_llm_http_handler support for GitHub Copilot chat completions

Relevant issues

Fixes #14496

Pre-Submission checklist

Please complete all items before asking a LiteLLM maintainer to review your PR

  • I have Added testing in the tests/litellm/ directory
  • I have added a screenshot of my new test passing locally
  • My PR passes all unit tests on [make test-unit]
  • My PR's scope is as isolated as possible, it only solves 1 specific problem

Type

✨ Feature / 🐛 Bug Fix

Changes

Problem

  1. Vision requests failing: GitHub Copilot vision requests were failing with a 400 Bad Request error due to the missing Copilot-Vision-Request header
  2. No base_llm_http_handler support: GitHub Copilot couldn't work with the new base_llm_http_handler routing when the EXPERIMENTAL_OPENAI_BASE_LLM_HTTP_HANDLER flag is enabled

Solution

1. Enhanced validate_environment() (fixes vision requests)

Extended the validate_environment() method in GithubCopilotConfig to add GitHub Copilot-specific headers (a sketch of the header logic follows the list below):

  • Enhanced validate_environment() in GithubCopilotConfig (litellm/llms/github_copilot/chat/transformation.py)
    • Calls super().validate_environment() to get base OpenAI headers (Authorization, Content-Type)
    • Adds X-Initiator header based on message roles:
      • Returns "agent" if any message has role "tool" or "assistant"
      • Returns "user" otherwise
    • Adds Copilot-Vision-Request: true header when image content is detected:
      • Checks for image_url in message content
      • Checks for content items with type: "image_url"
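
A minimal, self-contained sketch of the header logic described above (plain dict messages rather than litellm's typed message values; the helper name `_copilot_extra_headers` is illustrative only and does not appear in the PR):

```python
from typing import Any, Dict, List


def _copilot_extra_headers(messages: List[Dict[str, Any]]) -> Dict[str, str]:
    """Illustrative only: derive the Copilot-specific headers described above."""
    headers: Dict[str, str] = {}

    # X-Initiator: "agent" if any message has role "tool" or "assistant", else "user"
    roles = {m.get("role") for m in messages}
    headers["X-Initiator"] = "agent" if roles & {"tool", "assistant"} else "user"

    # Copilot-Vision-Request: sent only when image content is detected
    def has_image(message: Dict[str, Any]) -> bool:
        content = message.get("content")
        if isinstance(content, list):
            return any(
                isinstance(item, dict)
                and (item.get("type") == "image_url" or "image_url" in item)
                for item in content
            )
        return False

    if any(has_image(m) for m in messages):
        headers["Copilot-Vision-Request"] = "true"

    return headers
```

In the actual change these headers are merged into the dict returned by super().validate_environment(), so the base Authorization and Content-Type headers are preserved.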

2. Add base_llm_http_handler support

Enabled GitHub Copilot to work with the new base_llm_http_handler routing mechanism:

  • Added get_complete_url() method to construct the full endpoint URL (see the sketch after this list)

    • Returns: https://api.githubcopilot.com/chat/completions
    • Supports dynamic api_base from authenticator
    • Handles trailing slash normalization
  • Added custom_llm_provider property returning LlmProviders.GITHUB_COPILOT

    • Ensures correct provider identification in routing logic
  • Simplified __init__() method

    • Removed 3 unused parameters: api_key, api_base, custom_llm_provider
    • These were dead code, as the property always overrides the parameter value
    • Aligns with patterns from 30+ other providers in the codebase
  • Fixed test to use mock_response parameter

    • test_completion_github_copilot_mock_response now uses mock_response instead of patching
    • More robust and simpler test implementation
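
A rough sketch of the URL construction described in the first bullet above (a standalone function rather than the actual method, whose signature follows litellm's base config interface; the names `DEFAULT_COPILOT_API_BASE` and `copilot_complete_url` are illustrative):

```python
from typing import Optional

# Default endpoint per the PR description; a dynamic api_base from the
# authenticator (or the caller) takes precedence when provided.
DEFAULT_COPILOT_API_BASE = "https://api.githubcopilot.com"


def copilot_complete_url(api_base: Optional[str] = None) -> str:
    """Illustrative only: build the full chat completions URL."""
    base = (api_base or DEFAULT_COPILOT_API_BASE).rstrip("/")  # trailing slash normalization
    return f"{base}/chat/completions"


# Expected behavior per the bullets above:
assert copilot_complete_url() == "https://api.githubcopilot.com/chat/completions"
assert copilot_complete_url("https://example-proxy.test/copilot/") == (
    "https://example-proxy.test/copilot/chat/completions"
)
```

The custom_llm_provider property itself is a one-liner returning LlmProviders.GITHUB_COPILOT, which is what the routing logic keys on.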

Impact

  • Minimal & Isolated: Changes only affect GitHub Copilot provider
  • Backward Compatible: Works with both standard routing and new base_llm_http_handler
  • Feature Complete:
    • ✅ Vision requests now work properly
    • ✅ Compatible with EXPERIMENTAL_OPENAI_BASE_LLM_HTTP_HANDLER flag
    • ✅ Maintains all existing functionality
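
To illustrate the points above, a hedged end-to-end example of a vision request through litellm (the `github_copilot/gpt-4o` model alias and reading the experimental flag from an environment variable are assumptions made for illustration, not something this PR specifies):

```python
import os

import litellm

# Assumption: the experimental routing flag is consumed as an environment variable.
os.environ["EXPERIMENTAL_OPENAI_BASE_LLM_HTTP_HANDLER"] = "true"

# A vision request; with this PR, the Copilot-Vision-Request: true header is
# added automatically because the content contains an image_url item.
response = litellm.completion(
    model="github_copilot/gpt-4o",  # assumed model alias; requires Copilot auth to be set up
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "What is in this image?"},
                {"type": "image_url", "image_url": {"url": "https://example.com/cat.png"}},
            ],
        }
    ],
)
print(response.choices[0].message.content)
```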

Test Results

======================================== test session starts ========================================
platform linux -- Python 3.10.12, pytest-7.4.4, pluggy-1.5.0 -- /home/krzysiek/projects/litellm/.venv/bin/python
cachedir: .pytest_cache
rootdir: /home/krzysiek/projects/litellm
configfile: pyproject.toml
plugins: mock-3.14.1, asyncio-0.21.2, respx-0.22.0, requests-mock-1.12.1, anyio-4.5.2
asyncio: mode=auto
collected 20 items

tests/test_litellm/llms/github_copilot/test_github_copilot_transformation.py::test_completion_github_copilot_mock_response PASSED [  5%]
tests/test_litellm/llms/github_copilot/test_github_copilot_transformation.py::test_copilot_vision_request_header_text_only PASSED [ 10%]
tests/test_litellm/llms/github_copilot/test_github_copilot_transformation.py::test_copilot_vision_request_header_with_image PASSED [ 15%]
tests/test_litellm/llms/github_copilot/test_github_copilot_transformation.py::test_copilot_vision_request_header_with_type_image_url PASSED [ 20%]
tests/test_litellm/llms/github_copilot/test_github_copilot_transformation.py::test_custom_llm_provider_property PASSED [ 25%]
tests/test_litellm/llms/github_copilot/test_github_copilot_transformation.py::test_get_complete_url_strips_trailing_slash PASSED [ 30%]
tests/test_litellm/llms/github_copilot/test_github_copilot_transformation.py::test_get_complete_url_with_authenticator_api_base PASSED [ 35%]
tests/test_litellm/llms/github_copilot/test_github_copilot_transformation.py::test_get_complete_url_with_custom_api_base PASSED [ 40%]
tests/test_litellm/llms/github_copilot/test_github_copilot_transformation.py::test_get_complete_url_with_default_api_base PASSED [ 45%]
tests/test_litellm/llms/github_copilot/test_github_copilot_transformation.py::test_get_supported_openai_params_case_insensitive PASSED [ 50%]
tests/test_litellm/llms/github_copilot/test_github_copilot_transformation.py::test_get_supported_openai_params_claude_model PASSED [ 55%]
tests/test_litellm/llms/github_copilot/test_github_copilot_transformation.py::test_github_copilot_config_get_openai_compatible_provider_info PASSED [ 60%]
tests/test_litellm/llms/github_copilot/test_github_copilot_transformation.py::test_transform_messages_disable_copilot_system_to_assistant PASSED [ 65%]
tests/test_litellm/llms/github_copilot/test_github_copilot_transformation.py::test_x_initiator_header_agent_request_with_assistant PASSED [ 70%]
tests/test_litellm/llms/github_copilot/test_github_copilot_transformation.py::test_x_initiator_header_agent_request_with_tool PASSED [ 75%]
tests/test_litellm/llms/github_copilot/test_github_copilot_transformation.py::test_x_initiator_header_empty_messages PASSED [ 80%]
tests/test_litellm/llms/github_copilot/test_github_copilot_transformation.py::test_x_initiator_header_mixed_messages_with_agent_roles PASSED [ 85%]
tests/test_litellm/llms/github_copilot/test_github_copilot_transformation.py::test_x_initiator_header_system_only_messages PASSED [ 90%]
tests/test_litellm/llms/github_copilot/test_github_copilot_transformation.py::test_x_initiator_header_user_only_messages PASSED [ 95%]
tests/test_litellm/llms/github_copilot/test_github_copilot_transformation.py::test_x_initiator_header_user_request PASSED [100%]

================================== 20 passed, 3 warnings in 0.98s ===================================

All 20 tests passing:

  • 14 existing tests (vision headers, X-Initiator header logic, message transformation, Claude parameter support, etc.)
  • 5 new tests added in this PR:
    • test_custom_llm_provider_property - verifies property returns correct provider
    • test_get_complete_url_with_default_api_base - default URL construction
    • test_get_complete_url_with_custom_api_base - custom api_base handling
    • test_get_complete_url_with_authenticator_api_base - dynamic api_base from authenticator
    • test_get_complete_url_strips_trailing_slash - trailing slash normalization
  • 1 modified test:
    • test_completion_github_copilot_mock_response - now uses the mock_response parameter instead of patching (illustrative sketch below)
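
Roughly, the modified test now takes this shape (illustrative only, not the literal test from the PR; the Authenticator patch target path is an assumption, and the patches mirror a later commit in this PR that prevents real OAuth attempts during the mock run):

```python
from unittest.mock import patch

import litellm

# Assumed module path for the patch targets, for illustration only.
AUTH = "litellm.llms.github_copilot.authenticator.Authenticator"


@patch(f"{AUTH}.get_api_key", return_value="fake-token")
@patch(f"{AUTH}.get_api_base", return_value="https://api.githubcopilot.com")
def test_completion_github_copilot_mock_response(mock_base, mock_key):
    # mock_response short-circuits the HTTP call inside litellm.completion,
    # so no request/response patching is needed.
    resp = litellm.completion(
        model="github_copilot/gpt-4o",  # assumed model alias
        messages=[{"role": "user", "content": "hello"}],
        mock_response="Hello from Copilot!",
    )
    assert resp.choices[0].message.content == "Hello from Copilot!"
```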

Type checking: mypy passes with no issues

Files Changed

  • litellm/llms/github_copilot/chat/transformation.py - Enhanced validate_environment(), added get_complete_url(), custom_llm_provider property, simplified __init__()
  • tests/test_litellm/llms/github_copilot/test_github_copilot_transformation.py - Added 6 new tests, modified 1 existing test


vercel bot commented Oct 28, 2025

@krozbicki is attempting to deploy a commit to the CLERKIEAI Team on Vercel.

A member of the Team first needs to authorize it.

@krozbicki krozbicki force-pushed the fix/github-copilot-vision-headers branch from 23badea to 1ea3228 on November 29, 2025 14:29
…dation()

- Added requires_custom_headers_validation() method to BaseConfig (default False)
- Override in GithubCopilotConfig to return True
- Modified OpenAIChatCompletion.completion() to conditionally call validate_environment()
- Fixes missing 'Copilot-Vision-Request' header for vision requests
- Only affects providers that explicitly require custom headers validation
- Test that GithubCopilotConfig.requires_custom_headers_validation() returns True
- Test that default BaseConfig implementation returns False
- All 17 tests in test_github_copilot_transformation.py passing
…sed approach

- Remove requires_custom_headers_validation() method from BaseConfig
- Remove requires_custom_headers_validation() override from GithubCopilotConfig
- Always call validate_environment() in OpenAI completion flow
- Remove related tests for the removed flag method
@krozbicki krozbicki force-pushed the fix/github-copilot-vision-headers branch from 1ea3228 to bf43af0 on November 29, 2025 14:31
…letions

- Add get_complete_url() method to construct full endpoint URL for base_llm_http_handler
- Add custom_llm_provider property returning LlmProviders.GITHUB_COPILOT
- Simplify __init__() to take no parameters (api_key, api_base, custom_llm_provider removed)
  - These parameters were unused as property always overrides parameter value
  - Aligns with patterns from 30+ other providers in the codebase
- Maintain validate_environment() with super() call for proper authentication
- Add 5 new tests for get_complete_url() and custom_llm_provider property
- Fix test_completion_github_copilot_mock_response to use mock_response parameter
- All 50 tests passing, mypy type checking successful

This enables GitHub Copilot to work with base_llm_http_handler when
EXPERIMENTAL_OPENAI_BASE_LLM_HTTP_HANDLER flag is enabled, while maintaining
backward compatibility with standard routing.
…response

Add @patch decorators for Authenticator.get_api_key and get_api_base
to prevent real OAuth authentication attempts during mock response test.

Fixes GitHub Actions test failure:
'GetLLMProvider Exception - Failed to get access token after 3 attempts'

vercel bot commented Dec 2, 2025

The latest updates on your projects. Learn more about Vercel for GitHub.

| Project | Deployment | Preview | Comments | Updated (UTC) |
| --- | --- | --- | --- | --- |
| litellm | Ready | Preview | Comment | Dec 2, 2025 8:50pm |

