VectorChat is an AI-powered agentic platform featuring retrieval-augmented generation (RAG), document ingestion, and agent tooling with skills management for advanced and scalable workflows.
This repo contains:

- `cmd/` + `internal/` + `pkg/`: Go API and workers
- `frontend/`: Nuxt 3 main UI
- `vectorchat-light/`: a lighter Nuxt UI
- `ory/`: Kratos/Hydra/Oathkeeper config
- `services/`: internal service containers (e.g., MarkItDown API, code runner)
- `docker-compose.yml`: full local stack
Prereqs:
- Docker + Docker Compose
- LLM API keys (OpenAI/Gemini/etc.)
- Create your env files:

  ```sh
  cp .env.example .env
  cp litellm/.env.example litellm/.env
  cp vectorchat-light/.env.example vectorchat-light/.env
  ```

- Fill in the required values in `.env`, `litellm/.env`, and `vectorchat-light/.env`.
- Start the full stack:

  ```sh
  docker compose up --build
  ```

- Open the apps:
- Main UI: http://localhost:3000
- VectorChat Light: http://localhost:3100
- API gateway (Oathkeeper): http://localhost:4456
- Mailhog (dev email): http://localhost:8025
- pgAdmin: http://localhost:5050 (admin@example.com / admin)
- LiteLLM: http://localhost:4000
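Before bringing the stack up, it can help to confirm the required keys actually made it into `.env`. A minimal sketch (the `check_env` helper and the variable list are illustrative, not part of this repo):

```sh
# check_env: print any of the given variable names that are not set in .env.
# Helper name and the variables checked below are examples, not repo tooling.
check_env() {
  for var in "$@"; do
    grep -q "^${var}=" .env 2>/dev/null || echo "missing: ${var}"
  done
}

check_env OPENAI_API_KEY GITHUB_ID GITHUB_SECRET
```

If anything prints as `missing:`, fill it in before running `docker compose up --build`.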
Use `.env.example` as the source of truth. Common required values:

- `OPENAI_API_KEY`
- `GEMINI_API_KEY` (optional, for multi-provider setups)
- `GITHUB_ID`, `GITHUB_SECRET`
- `BASE_URL` (Oathkeeper public base, default `localhost:4456`)
- `FRONTEND_URL` (default `http://localhost:3000`)
- `IS_SSL` (default `false`)
- `MIGRATIONS_PATH` (default `./internal/db/migrations`)
- `STRIPE_API_KEY`, `STRIPE_WEBHOOK_SECRET`
- `GOOGLE_CALENDAR_CLIENT_ID`, `GOOGLE_CALENDAR_CLIENT_SECRET`, `GOOGLE_CALENDAR_REDIRECT_URL`
- `GMAIL_CLIENT_ID`, `GMAIL_CLIENT_SECRET`, `GMAIL_REDIRECT_URL`
- `SLACK_CLIENT_ID`, `SLACK_CLIENT_SECRET`, `SLACK_REDIRECT_URL`
- `ENCRYPTION_KEY` (base64-encoded 32 bytes)
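`ENCRYPTION_KEY` needs to be a base64-encoded 32-byte value; one way to generate one, assuming `openssl` is available:

```sh
# Emit 32 random bytes, base64-encoded, suitable for ENCRYPTION_KEY
openssl rand -base64 32
```

Paste the printed value into `.env` as `ENCRYPTION_KEY=<value>`.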
Additional env files:
- `litellm/.env` is used by LiteLLM for provider keys.
- `vectorchat-light/.env` configures the light UI client and OAuth credentials.
The Docker Compose stack includes:
- `app`: Go API server (Fiber) + core services
- `frontend`: Nuxt 3 main UI
- `vectorchat-light`: lightweight Nuxt UI
- `postgres`: PostgreSQL 14 + pgvector
- `litellm`: LLM proxy
- `nats`: JetStream queue
- `oathkeeper`, `kratos`, `hydra`: Ory auth stack
- `crawl4ai`, `markitdown`: web scraping + doc processing
- `execute-code-worker`: sandboxed code runner
- `crawl-scheduler`: background worker for crawl jobs
- `mailhog`: local SMTP inbox
- `pgadmin`: database UI
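For local tweaks to individual services, Docker Compose automatically merges a `docker-compose.override.yml` placed next to the main file. A hedged sketch (the service name matches the stack above, but the port remap is illustrative, not a repo default):

```yaml
# docker-compose.override.yml — local-only adjustments, kept out of version control.
services:
  pgadmin:
    ports:
      - "5051:80"   # example: move pgAdmin off 5050 if that port is taken
```

Run `docker compose up` as usual; the override is picked up without extra flags.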
API docs are served from the API root when the server is running:
- API docs: http://localhost:4456/ (Oathkeeper gateway)
- Aliases: `/docs`, `/reference`, `/swagger`
- OpenAPI: `/openapi.json`
Authentication flow (local):
- Get a bearer token from `POST /public/oauth/token` through Oathkeeper
- Click "Authorize" in the docs and paste `Bearer <token>`
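The token request above can be sketched as a raw HTTP call. This is an assumption-heavy example: the grant type and form fields depend on how your Hydra OAuth client is configured, and the credential values are placeholders:

```http
POST /public/oauth/token HTTP/1.1
Host: localhost:4456
Content-Type: application/x-www-form-urlencoded

grant_type=client_credentials&client_id=<your-client-id>&client_secret=<your-client-secret>
```

Subsequent API requests then carry the returned token as `Authorization: Bearer <token>`.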
```sh
make build    # docker compose build
make run      # docker compose up -d
make stop     # docker compose down
make clean    # docker compose down -v
make migrate  # run goose migrations inside the app container
make swagger  # regenerate OpenAPI docs
```

- The backend follows a layered architecture: Handler → Service → Repository.
- SQL lives in `internal/db/` and uses `sqlx`.
- Auth is handled by Ory. Do not build custom auth flows.
- The Nuxt apps use `useFetch` and the Ory public URL for auth.
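The Handler → Service → Repository layering can be sketched in Go. All type and function names below are hypothetical illustrations, not VectorChat's actual types, and the handler is a plain function rather than a Fiber route:

```go
package main

import "fmt"

// Repository layer: data access behind an interface (hypothetical example).
type DocumentRepository interface {
	GetTitle(id int) (string, error)
}

type inMemoryRepo struct{ titles map[int]string }

func (r inMemoryRepo) GetTitle(id int) (string, error) {
	t, ok := r.titles[id]
	if !ok {
		return "", fmt.Errorf("document %d not found", id)
	}
	return t, nil
}

// Service layer: business logic, depends only on the repository interface.
type DocumentService struct{ repo DocumentRepository }

func (s DocumentService) Describe(id int) (string, error) {
	t, err := s.repo.GetTitle(id)
	if err != nil {
		return "", err
	}
	return "Document: " + t, nil
}

// Handler layer: translates a request into a service call and a response.
func handleDescribe(s DocumentService, id int) string {
	out, err := s.Describe(id)
	if err != nil {
		return "error: " + err.Error()
	}
	return out
}

func main() {
	svc := DocumentService{repo: inMemoryRepo{titles: map[int]string{1: "RAG notes"}}}
	fmt.Println(handleDescribe(svc, 1))
}
```

Keeping the service dependent on an interface rather than a concrete store is what makes each layer testable in isolation.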
- Auth issues: ensure Ory containers are healthy and `.env` has `GITHUB_ID`/`GITHUB_SECRET`.
- LLM errors: check `OPENAI_API_KEY` in `.env` and `litellm/.env`.
- DB errors: verify Postgres is healthy and migrations ran (`make migrate`).
See LICENSE.