
Pass all the parameters allowed by streamablehttp_client via DatabricksMCPClient for better control over headers, timeout, etc #253

@yenlow

Description


We need to allow additional parameters (e.g. headers, timeout, terminate_on_close) to be passed through to the streamablehttp_client used by DatabricksMCPClient. Patching the DatabricksMCPClient methods that call streamablehttp_client so they accept **kwargs resolved the zstd decoding error shown below.

Running:

from databricks.sdk import WorkspaceClient
from databricks_mcp import DatabricksMCPClient
import nest_asyncio

nest_asyncio.apply()

ws_client = WorkspaceClient()  # authenticated Databricks workspace client
server_url = "https://e2-demo-field-eng.cloud.databricks.com/api/2.0/mcp/external/conn_aichemy_pubchem"
#server_url = "https://mcp-nitin-1444828305810485.aws.databricksapps.com/mcp"
mcp_client = DatabricksMCPClient(server_url=server_url, workspace_client=ws_client)
mcp_client.list_tools()

Error returned:

Error parsing JSON response
Traceback (most recent call last):
  File "/local_disk0/.ephemeral_nfs/envs/pythonEnv-588c2647-b519-45f9-bf41-fa08778d1b83/lib/python3.12/site-packages/httpx/_decoders.py", line 185, in decode
    output.write(self.decompressor.decompress(data))
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
zstd.ZstdError: zstd decompressor error: Unknown frame descriptor

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/local_disk0/.ephemeral_nfs/envs/pythonEnv-588c2647-b519-45f9-bf41-fa08778d1b83/lib/python3.12/site-packages/mcp/client/streamable_http.py", line 384, in _handle_json_response
    content = await response.aread()
              ^^^^^^^^^^^^^^^^^^^^^^
  File "/local_disk0/.ephemeral_nfs/envs/pythonEnv-588c2647-b519-45f9-bf41-fa08778d1b83/lib/python3.12/site-packages/httpx/_models.py", line 979, in aread
    self._content = b"".join([part async for part in self.aiter_bytes()])
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/local_disk0/.ephemeral_nfs/envs/pythonEnv-588c2647-b519-45f9-bf41-fa08778d1b83/lib/python3.12/site-packages/httpx/_models.py", line 998, in aiter_bytes
    decoded = decoder.decode(raw_bytes)
              ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/local_disk0/.ephemeral_nfs/envs/pythonEnv-588c2647-b519-45f9-bf41-fa08778d1b83/lib/python3.12/site-packages/httpx/_decoders.py", line 191, in decode
    raise DecodingError(str(exc)) from exc
httpx.DecodingError: zstd decompressor error: Unknown frame descriptor

Solution:
Adding an Accept-Encoding: identity header resolved the error above (the server's response otherwise fails zstd decompression in httpx). Please also pass kwargs such as timeout through to streamablehttp_client for greater control.
https://github.com/databricks/databricks-ai-bridge/blob/main/databricks_mcp/src/databricks_mcp/mcp.py#L135

    async def _get_tools_async(self, **kwargs) -> List[Tool]:
        """Fetch tools from the MCP endpoint asynchronously."""
        async with streamablehttp_client(
            url=self.server_url,
            auth=DatabricksOAuthClientProvider(self.client),
            headers={"Accept-Encoding": "identity"},
            **kwargs,
        ) as (read_stream, write_stream, _):
            async with ClientSession(read_stream, write_stream) as session:
                await session.initialize()
                return (await session.list_tools()).tools
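For the public API, the synchronous entry point also needs to forward these keyword arguments into the async layer. The snippet below is a minimal, self-contained sketch of that forwarding pattern; StreamSettingsClient and stream_client are hypothetical stand-ins (not the real streamablehttp_client API) used only to make the kwargs plumbing visible:

```python
import asyncio
from contextlib import asynccontextmanager
from typing import Any

# Hypothetical stand-in for streamablehttp_client: accepts arbitrary
# keyword options (headers, timeout, terminate_on_close, ...).
@asynccontextmanager
async def stream_client(url: str, **kwargs: Any):
    # A real client would open the HTTP stream here; we just expose the
    # merged settings so the forwarding can be observed.
    defaults = {"headers": {"Accept-Encoding": "identity"}}
    yield {**defaults, **kwargs, "url": url}

class StreamSettingsClient:
    """Sketch of a client whose sync API forwards **kwargs to the async layer."""

    def __init__(self, server_url: str):
        self.server_url = server_url

    async def _get_tools_async(self, **kwargs: Any):
        async with stream_client(self.server_url, **kwargs) as settings:
            return settings  # a real client would run the MCP session here

    def list_tools(self, **kwargs: Any):
        # Run the async method on an event loop, passing extra options through,
        # mirroring how a sync list_tools(**kwargs) could delegate.
        return asyncio.run(self._get_tools_async(**kwargs))

client = StreamSettingsClient("https://example.com/mcp")
settings = client.list_tools(timeout=60, terminate_on_close=False)
print(settings["timeout"])  # 60
```

The same shape applies to call_tool and any other method that opens a stream: each sync method takes **kwargs and hands them, unmodified, to streamablehttp_client.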

mcp_client.list_tools(timeout=60, terminate_on_close=False) now successfully returns:

[Tool(name='search_compounds', title=None, description='Search PubChem database for compounds by name, CAS number, formula, or identifier', inputSchema={'type': 'object', 'properties': {'query': {'type': 'string', 'description': 'Search query (compound name, CAS, formula, or identifier)'}, 'search_type': {'type': 'string', 'enum': ['name', 'smiles', 'inchi', 'sdf', 'cid', 'formula'], 'description': 'Type of search to perform (default: name)'}, 'max_records': {'type': 'number', 'description': 'Maximum number of results (1-10000, default: 100)', 'minimum': 1, 'maximum': 10000}}, 'required': ['query']}, outputSchema=None, icons=None, annotations=None, meta=None, execution=None),
...]
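Until kwargs support lands upstream, the patching approach described above can be applied at runtime. This is a generic monkeypatching sketch on a hypothetical stand-in class (ToolClient and _fetch are illustrative names; in practice one would replace DatabricksMCPClient._get_tools_async with a version that injects the header and accepts **kwargs):

```python
# Generic runtime-patching pattern: swap a method on an existing class
# for a wrapper that injects a default header and passes extra kwargs through.
class ToolClient:
    """Hypothetical stand-in for DatabricksMCPClient."""

    def _fetch(self, headers=None):
        return {"headers": headers or {}}

_original_fetch = ToolClient._fetch

def _patched_fetch(self, headers=None, **kwargs):
    # Inject the Accept-Encoding workaround unless the caller overrides it.
    merged = {"Accept-Encoding": "identity", **(headers or {})}
    result = _original_fetch(self, headers=merged)
    result.update(kwargs)  # forward extra options as well
    return result

ToolClient._fetch = _patched_fetch

client = ToolClient()
out = client._fetch(timeout=60)
print(out["headers"])  # {'Accept-Encoding': 'identity'}
```

Keeping a reference to the original method (_original_fetch) lets the patch delegate rather than duplicate the library's logic, so the workaround can be deleted cleanly once the fix ships.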
