Serapeum is a modular LLM toolkit. The repo contains a core package with shared LLM abstractions plus provider integrations (OpenAI, Azure OpenAI, Ollama, llama.cpp) and supporting docs and examples.
- `libs/core/`: Provider-agnostic core (LLM interfaces, prompts, parsers, tools, structured outputs).
- `libs/providers/`: Provider adapters (OpenAI, Azure OpenAI, Ollama, llama.cpp).
- `docs/`: Architecture and reference docs (MkDocs).
- `examples/`: Usage examples and notebooks.
| Package | Location | Description |
|---|---|---|
| serapeum-core | `libs/core/` | Shared LLM interfaces, prompt templates, output parsers, tool schemas |
| serapeum-openai | `libs/providers/openai/` | OpenAI Chat Completions and Responses API adapter |
| serapeum-azure-openai | `libs/providers/azure-openai/` | Azure OpenAI deployment adapter (extends serapeum-openai) |
| serapeum-ollama | `libs/providers/ollama/` | Ollama-backed LLM and embedding adapter |
| serapeum-llama-cpp | `libs/providers/llama-cpp/` | Local GGUF model inference via llama-cpp-python |
Each package has its own README with details and examples.
This project uses uv for dependency management and workspace orchestration.
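Workspace members are declared in the root `pyproject.toml`. A minimal sketch of that declaration, assuming a glob over the provider directories (the exact member list is illustrative, not copied from this repo):

```toml
# Root pyproject.toml (illustrative; actual member globs may differ)
[tool.uv.workspace]
members = ["libs/core", "libs/providers/*"]
```

With this in place, `uv sync --package <name>` resolves any workspace member by its package name.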
```bash
# Install uv (if not already installed)
curl -LsSf https://astral.sh/uv/install.sh | sh

# Sync all workspace dependencies from root
uv sync --dev --active

# Or sync a specific provider
uv sync --package serapeum-openai --active
```

The workspace venv lives outside the project, so use `python -m pytest` directly (not `uv run pytest`):
```bash
# Run all tests
python -m pytest

# Run tests for a specific package
python -m pytest libs/core/tests
python -m pytest libs/providers/openai/tests
python -m pytest libs/providers/ollama/tests
python -m pytest libs/providers/llama-cpp/tests

# Skip end-to-end tests
python -m pytest -m "not e2e"
```

Markers are defined in each package's `pyproject.toml`.
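A typical registration of the `e2e` marker used above looks like the following sketch (the marker description is an assumption; check each package's `pyproject.toml` for the authoritative list):

```toml
# libs/<package>/pyproject.toml (illustrative)
[tool.pytest.ini_options]
markers = [
    "e2e: end-to-end tests that require a live model backend",
]
```

Registering markers this way keeps `pytest -m "not e2e"` reliable and avoids `PytestUnknownMarkWarning` at collection time.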
- Homepage: https://github.com/serapeum-org/serapeum
- Docs: https://serapeum-org.github.io/serapeum
- Security: SECURITY.md