feat(ollama): add Ollama provider for local LLM inference#53

Merged
Rohitjoshi9023 merged 7 commits into main from feat/ollama-provider
Feb 26, 2026
Conversation

@Sahil5963
Contributor

Summary

  • Adds complete Ollama provider support in @yourgpt/llm-sdk with streaming, vision, tools, and Ollama-specific options
  • Adds full-featured ollama-demo example with React UI + Express server + CLI demos
  • Adds comprehensive documentation for Ollama provider

Changes

SDK (packages/llm-sdk/)

  • Ollama Provider: createOllama() callable provider (Vercel AI SDK style)
  • Ollama Adapter: ~330 lines covering streaming, vision support, and tool calling
  • 15+ models in the registry with capability metadata (vision, tools, context length)
  • Ollama-specific options: num_ctx, mirostat, repeat_penalty, seed, top_p, top_k, etc.
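To make the option list above concrete, here is a minimal usage sketch. It assumes the `createOllama()` factory and option names named in this PR; the exact `@yourgpt/llm-sdk` signatures and defaults may differ, and the values shown are illustrative only.

```typescript
// Hypothetical usage sketch based on the PR description; the actual
// createOllama() API in @yourgpt/llm-sdk may differ.
// import { createOllama } from "@yourgpt/llm-sdk";

// Ollama-specific sampling/runtime options listed in this PR.
interface OllamaOptions {
  num_ctx?: number;        // context window size in tokens
  mirostat?: 0 | 1 | 2;    // mirostat sampling mode (0 = disabled)
  repeat_penalty?: number; // penalty applied to repeated tokens
  seed?: number;           // fixed seed for reproducible sampling
  top_p?: number;          // nucleus sampling threshold
  top_k?: number;          // top-k sampling cutoff
}

const options: OllamaOptions = {
  num_ctx: 8192,
  mirostat: 0,
  repeat_penalty: 1.1,
  seed: 42,
  top_p: 0.9,
  top_k: 40,
};

// Vercel AI SDK style: the provider itself is callable with a model id.
// const ollama = createOllama({ baseURL: "http://localhost:11434" });
// const model = ollama("llama3.1", options);
```

The callable-provider shape mirrors the "Vercel AI SDK style" the PR calls out: one factory configures the endpoint, and each call binds a model id plus per-model options.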

Demo (examples/ollama-demo/)

  • React + Vite client with glassmorphism UI
  • Express server with /api/copilot endpoint
  • Server-side tools (weather, calculator)
  • CLI demos: chat, tools, vision, options
  • Thread persistence and custom ThreadPicker
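The server-side tools mentioned above (weather, calculator) might be shaped roughly like this. The `ToolDef` interface and the tool body are assumptions for illustration, not the demo's actual code.

```typescript
// Hypothetical server-side tool definition mirroring the demo's
// calculator tool; the real @yourgpt/llm-sdk tool API may differ.
interface ToolDef<Args, Result> {
  description: string;
  execute: (args: Args) => Result;
}

const calculator: ToolDef<{ expression: string }, number> = {
  description: "Evaluate a simple arithmetic expression",
  execute: ({ expression }) => {
    // Demo-grade guard: only digits, operators, parens, and whitespace.
    if (!/^[\d+\-*/().\s]+$/.test(expression)) {
      throw new Error("unsupported expression");
    }
    return Function(`"use strict"; return (${expression});`)() as number;
  },
};

const result = calculator.execute({ expression: "2 + 3 * 4" }); // 14
```

Running tools server-side like this keeps evaluation out of the browser and lets the `/api/copilot` endpoint return tool results inside the streamed response.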

Docs (apps/docs/)

  • providers/ollama.mdx - Setup, models, options, examples

Type of Change

  • New feature (non-breaking change that adds functionality)
  • Documentation update

Testing

  • Tested locally with Ollama (llama3.1, llava)
  • Verified streaming, tool calling, vision
  • Demo UI and CLI demos working

🤖 Generated with Claude Code

Sahil5963 and others added 5 commits February 13, 2026 20:19
Add complete Ollama integration for running LLMs locally without API keys:

- Ollama adapter with streaming, tool calling, and vision support
- Provider configuration with Ollama-specific options (num_ctx, mirostat, etc.)
- Documentation for the Ollama provider
- Demo example with Express server, chat, tools, and vision demos

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
…integration

- Introduced a new React client for the Ollama demo, featuring a chat interface using `@yourgpt/copilot-sdk`.
- Updated server to handle chat and tool calls, including weather and calculation functionalities.
- Enhanced README with setup instructions and features overview.
- Added necessary configurations for Vite and TypeScript in the client.
- Included environment variables for Ollama configuration in the server.

This update provides a complete example of running a local LLM with a user-friendly interface.
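The environment variables for Ollama configuration mentioned above might look like the sketch below. The variable names are assumptions based on the commit message, not confirmed from the repo; only the default Ollama endpoint (`http://localhost:11434`) is standard.

```shell
# Hypothetical .env for the demo's Express server; names are illustrative.
OLLAMA_BASE_URL=http://localhost:11434   # default Ollama API endpoint
OLLAMA_MODEL=llama3.1                    # default chat model
PORT=3001                                # Express server port
```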
@vercel

vercel bot commented Feb 26, 2026

The latest updates on your projects. Learn more about Vercel for GitHub.

Project             Deployment  Actions           Updated (UTC)
copilot-playground  Error                         Feb 26, 2026 8:31am
copilot-sdk-docs    Ready       Preview, Comment  Feb 26, 2026 8:31am

@Rohitjoshi9023 merged commit c7c2811 into main Feb 26, 2026
3 of 4 checks passed
3 participants