docs/context.md — 4 changes: 2 additions & 2 deletions

@@ -10,7 +10,7 @@ This document outlines the key technical conventions and architectural decisions
- External functionalities are integrated as Mastra Tools or via MCP (Model Context Protocol).
- Multiple agents can coexist in a single Mastra instance.
- **AI Provider:** RunPod AI SDK Provider (`@runpod/ai-sdk-provider` v0.9.0)
- - Uses OpenAI GPT-OSS-120B model (`openai/gpt-oss-120b`) for agent reasoning.
+ - Uses Qwen3-32B model (`qwen/qwen3-32b-awq`) for agent reasoning.
- Supports streaming and non-streaming text generation.
- **Server Framework:** Hono (via Mastra's built-in server)
- **Storage:** Optional PostgreSQL with PgVector extension (defaults to in-memory)
@@ -125,4 +125,4 @@ When all DB credentials are provided, PostgreSQL with PgVector is used. Otherwise
- `README.md`: Hub-specific documentation displayed on the Hub page
- **Endpoint Type:** Load Balancer (`LB`) for high availability
- **Runtime:** CPU serverless (no GPU required)
- - **Category:** `language` (AI/LLM category)
+ - **Category:** `language` (AI/LLM category)
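For reviewers who want to try the model swap outside the agent, the sketch below shows roughly how the RunPod provider can be pointed at the new model id. It is not part of this PR: the `createRunpod` factory and the `RUNPOD_API_KEY` variable name are assumptions based on common Vercel AI SDK provider conventions (only the `runpod("qwen/qwen3-32b-awq")` call is confirmed by the diff), and the exact exports of `@runpod/ai-sdk-provider` v0.9.0 may differ. (The second hunk above appears to change only the trailing newline on the file's last line.)

```ts
// Sketch only — not part of this PR. createRunpod and RUNPOD_API_KEY are
// assumed names following the usual Vercel AI SDK provider conventions.
import { createRunpod } from "@runpod/ai-sdk-provider";
import { generateText } from "ai";

const runpod = createRunpod({
  apiKey: process.env.RUNPOD_API_KEY, // assumed env var name
});

// Quick smoke test of the model id introduced in this PR.
const { text } = await generateText({
  model: runpod("qwen/qwen3-32b-awq"),
  prompt: "Reply with one short sentence to confirm the model is reachable.",
});

console.log(text);
```

Streaming works the same way with the AI SDK's `streamText` in place of `generateText`, which lines up with the "streaming and non-streaming" note in the context doc.
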
src/mastra/agents/weather-agent.ts — 2 changes: 1 addition & 1 deletion

@@ -23,7 +23,7 @@ export const weatherAgent = new Agent({

Use the weatherTool to fetch current weather data.
`,
model: runpod("openai/gpt-oss-120b"),
model: runpod("qwen/qwen3-32b-awq"),
tools: { weatherTool },
memory,
});
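
To exercise the updated agent end to end, something like the following can be run from the project root. This is a sketch, not code from the PR: it assumes Mastra's `Agent#generate` method, omits any memory/thread options, and the import path simply mirrors the file shown in this diff.

```ts
// Sketch only — assumes Mastra's Agent#generate API; adjust the import path
// and memory/thread options to match the actual project layout.
import { weatherAgent } from "./src/mastra/agents/weather-agent";

const result = await weatherAgent.generate(
  "What's the weather like in Berlin right now?",
);

// The reply is now produced by runpod("qwen/qwen3-32b-awq") and may use weatherTool.
console.log(result.text);
```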