This simple Model Context Protocol (MCP) client demonstrates how a LangChain ReAct agent can use tools provided by MCP servers.
It leverages the utility function `convert_mcp_to_langchain_tools()` from
`langchain_mcp_tools`.
This function handles the parallel initialization of multiple specified MCP servers
and converts their available tools into a list of LangChain-compatible tools
(`list[BaseTool]`).
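The conversion step described above can be sketched roughly as follows. The server entries shown here (a fetch server and a filesystem server) are illustrative assumptions, not this repo's actual configuration; the return shape — a tool list plus a cleanup coroutine — follows the library's documented usage:

```python
# Sketch: turning MCP servers into LangChain tools with
# convert_mcp_to_langchain_tools(). Server entries are illustrative.
import asyncio

# Each key names a server; each value tells the client how to launch it.
mcp_servers = {
    "fetch": {
        "command": "uvx",
        "args": ["mcp-server-fetch"],
    },
    "filesystem": {
        "command": "npx",
        "args": ["-y", "@modelcontextprotocol/server-filesystem", "."],
    },
}

async def main() -> None:
    # Lazy import: only needed when the sketch is actually run.
    from langchain_mcp_tools import convert_mcp_to_langchain_tools

    # Initializes all servers in parallel; returns the combined tool list
    # plus a cleanup coroutine to shut the servers down when done.
    tools, cleanup = await convert_mcp_to_langchain_tools(mcp_servers)
    try:
        print([t.name for t in tools])
    finally:
        await cleanup()

# To run (requires uvx/npx and network access):
# asyncio.run(main())
```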
Google GenAI's `gemini-2.5-flash` is used as the LLM.
For convenience, code for OpenAI's and Anthropic's LLMs is also included, commented out.
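Selecting the LLM can be sketched with LangChain's `init_chat_model()` helper. The Gemini model ID comes from this README; the commented-out alternatives stand in for the OpenAI/Anthropic variants the README mentions, but their exact model IDs here are assumptions:

```python
# Sketch: choosing the LLM. Only the Gemini model ID is from the README;
# the commented alternatives use assumed example model IDs.
model_id = "google_genai:gemini-2.5-flash"
# model_id = "openai:gpt-4o-mini"
# model_id = "anthropic:claude-sonnet-4-0"

def make_llm(model_id: str):
    # Lazy import: requires the matching provider package to be installed
    # and the corresponding API key (from .env) to be set.
    from langchain.chat_models import init_chat_model
    return init_chat_model(model_id)

# llm = make_llm(model_id)  # e.g. needs GOOGLE_API_KEY in the environment
```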
A somewhat more realistic (conversational) MCP client is available here.
A TypeScript equivalent of this MCP client is available here.
- Python 3.11+
- [optional] `uv` (`uvx`) installed to run Python package-based MCP servers
- [optional] npm 7+ (`npx`) to run Node.js package-based MCP servers
- LLM API keys from OpenAI, Anthropic, and/or Google GenAI, as needed
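A quick way to confirm the optional tools above are on your PATH (a convenience check, not part of the repo's Makefile):

```shell
# Check which of the prerequisite commands are available.
# uvx/npx are only needed for the corresponding MCP server types.
for cmd in python3 uvx npx; do
  if command -v "$cmd" >/dev/null 2>&1; then
    echo "$cmd: found ($("$cmd" --version 2>/dev/null | head -n 1))"
  else
    echo "$cmd: not found"
  fi
done
```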
- Install dependencies: `make install`
- Set up API keys: `cp .env.template .env`, then update `.env` as needed.
  `.gitignore` is configured to ignore `.env` to prevent accidental commits of the credentials.
- Run the app: `make start` (it takes a while on the first run).
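Putting the pieces together, the app's core flow can be sketched as below. This is a hypothetical condensed version, not the repo's actual source; it assumes LangGraph's prebuilt ReAct agent and an illustrative single-server config, and requires the relevant API key (e.g. `GOOGLE_API_KEY`) plus `uvx` to actually run:

```python
# Hedged sketch of the app's core: MCP tools + an LLM in a ReAct agent.
import asyncio

async def run_query(query: str) -> str:
    # Lazy imports: these require the packages installed via `make install`.
    from langchain.chat_models import init_chat_model
    from langgraph.prebuilt import create_react_agent
    from langchain_mcp_tools import convert_mcp_to_langchain_tools

    # Illustrative server config; the real app reads its own configuration.
    mcp_servers = {"fetch": {"command": "uvx", "args": ["mcp-server-fetch"]}}

    tools, cleanup = await convert_mcp_to_langchain_tools(mcp_servers)
    try:
        llm = init_chat_model("google_genai:gemini-2.5-flash")
        agent = create_react_agent(llm, tools)
        result = await agent.ainvoke({"messages": [("user", query)]})
        # The agent's final answer is the last message in the state.
        return result["messages"][-1].content
    finally:
        await cleanup()

# Example (needs network access and an API key):
# print(asyncio.run(run_query("Fetch https://example.com and summarize it")))
```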
A simple example showing how to implement an OAuth client provider and
use it with the langchain-mcp-tools library can be found
in `src/streamable_http_oauth_test_client.py`.
For testing purposes, a sample MCP server with OAuth authentication support
that works with the above client is provided
in `src/streamable_http_oauth_test_server.py`.
You can run the server with `make run-streamable-http-oauth-test-server`
and the client with `make run-streamable-http-oauth-test-client`.