Supercharge AI Agents, Safely
Updated Mar 13, 2026 · Go
A retrieval-gated skill architecture for LLM agents that scales to hundreds of tools by exposing only the top-K relevant capabilities per request.
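None of these repositories' actual code appears on this page, but the core idea — score every registered tool against the incoming request and expose only the top-K matches — can be sketched in a few lines. This toy version uses bag-of-words cosine similarity in place of the embeddings a real retrieval gate would use; all names (`top_k_tools`, the sample tool catalog) are hypothetical.

```python
from collections import Counter
import math

def _vec(text: str) -> Counter:
    # Toy bag-of-words vector; a real system would use embeddings.
    return Counter(text.lower().split())

def _cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def top_k_tools(request: str, tools: dict[str, str], k: int = 3) -> list[str]:
    """Return only the k tool names most relevant to this request,
    so the LLM never sees the full catalog of hundreds of tools."""
    q = _vec(request)
    ranked = sorted(tools, key=lambda name: _cosine(q, _vec(tools[name])), reverse=True)
    return ranked[:k]

tools = {
    "search_flights": "search for airline flights between two cities",
    "get_weather": "get the current weather forecast for a city",
    "send_email": "send an email message to a recipient",
    "book_hotel": "book a hotel room in a city",
}
print(top_k_tools("what is the weather forecast in Paris", tools, k=2))
```

Only the two selected tool schemas would then be injected into the model's context for that turn.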
Orchestration MCP Server - Routes tool calls to MCP backends
Budget-aware context compilation and context firewall for tool-heavy AI agents.
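"Budget-aware context compilation" can be illustrated with a greedy packer: take context snippets in priority order and drop anything that would exceed the token budget. This is a minimal sketch, not the repository's implementation; token cost is crudely approximated by word count, and the function name `compile_context` is an assumption.

```python
def compile_context(items, budget_tokens):
    """Greedy budget-aware context compilation: pack highest-priority
    snippets first, skipping anything that would blow the token budget.
    Items are (priority, text) pairs; cost is approximated by word count."""
    chosen, used = [], 0
    for priority, text in sorted(items, key=lambda it: it[0], reverse=True):
        cost = len(text.split())  # crude stand-in for a real tokenizer
        if used + cost <= budget_tokens:
            chosen.append(text)
            used += cost
    return chosen, used

items = [
    (3, "You are a helpful router"),                      # 5 "tokens"
    (2, "tool schema for weather"),                       # 4 "tokens"
    (1, "older chat history that is very long indeed"),   # 8 "tokens"
]
chosen, used = compile_context(items, budget_tokens=10)
print(chosen, used)
```

The "context firewall" aspect is the same mechanism applied defensively: low-priority or untrusted material simply never makes it into the compiled prompt.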
Smart MCP router that dispatches tool calls by intent, reducing context rot for LLM hosts.
An AI assistant with hot-pluggable MCP tool servers — one Router Agent, dynamic tool discovery, and LLM-powered routing. Built with FastAPI, Redis, Ollama, Streamlit, and Prometheus.
Dynamic MCP multiplexer — one gateway, many upstream MCP servers.
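The multiplexer pattern — one gateway fronting many upstream MCP servers — typically namespaces tool names so the host sees a single flat catalog. A minimal sketch under that assumption (upstreams here are plain dicts of callables rather than real MCP connections; all names are hypothetical):

```python
class MCPMultiplexer:
    """One gateway, many upstreams: tools are exposed as 'upstream.tool'
    and each call is forwarded to the backend that owns it."""

    def __init__(self):
        self._upstreams = {}  # upstream name -> {tool name: callable}

    def attach(self, name, tools):
        self._upstreams[name] = tools

    def list_tools(self):
        # Flatten every upstream's tools into one namespaced catalog.
        return [f"{up}.{t}" for up, tools in self._upstreams.items() for t in tools]

    def call(self, qualified, *args):
        # Split 'math.add' into upstream 'math' and tool 'add', then forward.
        upstream, tool = qualified.split(".", 1)
        return self._upstreams[upstream][tool](*args)

mux = MCPMultiplexer()
mux.attach("math", {"add": lambda a, b: a + b})
mux.attach("text", {"upper": lambda s: s.upper()})
print(mux.list_tools())
print(mux.call("math.add", 2, 3))
```

Namespacing keeps tool names collision-free across upstreams, and the host only ever holds one connection.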
Hot-pluggable MCP multi-agent assistant — tool servers load on-demand, execute, and auto-detach without restart. LLM-powered routing via Ollama, Redis-backed tool registry + chat history, Prometheus/Grafana observability, and a Streamlit UI. Add new tools by dropping a Python file.
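The "load on-demand, execute, and auto-detach" lifecycle can be approximated with a TTL-based registry: each tool registers with an expiry, and expired tools are evicted on the next lookup. This in-memory sketch stands in for the repository's Redis-backed registry (which would use Redis key TTLs instead); the class and method names are assumptions.

```python
import time

class ToolRegistry:
    """In-memory stand-in for a Redis-backed tool registry: tools register
    with a TTL and auto-detach (expire) once the TTL lapses."""

    def __init__(self):
        self._tools = {}  # name -> (handler, expires_at)

    def register(self, name, handler, ttl_s=60.0):
        self._tools[name] = (handler, time.monotonic() + ttl_s)

    def call(self, name, *args, **kwargs):
        handler, expires = self._tools.get(name, (None, 0.0))
        if handler is None or time.monotonic() > expires:
            self._tools.pop(name, None)  # auto-detach expired tool
            raise KeyError(f"tool {name!r} not loaded")
        return handler(*args, **kwargs)

reg = ToolRegistry()
reg.register("add", lambda a, b: a + b, ttl_s=5.0)
print(reg.call("add", 2, 3))  # 5
```

"Add new tools by dropping a Python file" would then amount to a watcher that imports the file and calls `register` for each handler it finds.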