End-to-End Coding Platform Built with Next.js, Vercel Sandbox, and AI SDK.
⚠️ WARNING: This project is currently in early development
Features and APIs may change. Use at your own risk.
Features · Built with · Model providers · MCP connectors · Deploy your own · Running locally
- Monaco Editor Integration
  - Full-featured code editor with syntax highlighting, basic autocomplete, and multi-language support
- Real-time File Explorer
  - Tree-view file browser with click-to-select functionality and live file updates
- Live Code Editing
  - Edit files directly in the browser with auto-save and unsaved changes tracking
- Command Execution
  - Run shell commands with real-time output streaming and error monitoring
- Auto Error Detection
  - Intelligent error monitoring that automatically detects and reports issues
- MCP Connectors
  - Manage and integrate Model Context Protocol (MCP) connectors
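The auto error detection feature can be approximated with a simple pattern scan over streamed command output. The function and patterns below are an illustrative sketch, not the project's actual implementation:

```typescript
// Patterns that typically indicate a failed command or runtime error.
// This list is a hypothetical example; a real monitor would be more thorough.
const ERROR_PATTERNS: RegExp[] = [
  /\berror\b/i,
  /\bexception\b/i,
  /ENOENT|EACCES|ECONNREFUSED/,
  /command not found/,
];

// Return the lines of a command's output that match any known error pattern,
// so they can be surfaced to the user (or fed back to the model).
function detectErrors(output: string): string[] {
  return output
    .split('\n')
    .filter((line) => ERROR_PATTERNS.some((pattern) => pattern.test(line)));
}
```

In practice such a scanner would run incrementally as output chunks stream in, rather than over the full output at once.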
- Next.js 16 App Router
  - Advanced routing for seamless navigation and performance
  - React Server Components (RSCs) for server-side rendering and performance improvements
- AI SDK v5
  - Unified API for generating text, structured objects, and tool calls with LLMs
  - Hooks for building dynamic chat and generative user interfaces
- Vercel Sandbox
  - Secure, isolated Linux containers for code execution
  - Real-time preview of generated applications
- Drizzle ORM
  - Type-safe SQL ORM with TypeScript support
  - Database migrations and schema management
- shadcn/ui
  - Styling with Tailwind CSS
  - Component primitives from Radix UI for accessibility and flexibility
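As a sketch of what Drizzle's type-safe schemas look like, here is a minimal table definition. The table and column names are illustrative assumptions, not the project's actual schema:

```typescript
// Hypothetical example of a Drizzle ORM table using the Postgres core API.
// Assumes the `drizzle-orm` package is installed.
import { pgTable, text, timestamp, uuid } from 'drizzle-orm/pg-core';

// A table tracking files edited in the sandbox (illustrative only).
export const files = pgTable('files', {
  id: uuid('id').primaryKey().defaultRandom(),
  path: text('path').notNull(),
  content: text('content'),
  updatedAt: timestamp('updated_at').defaultNow(),
});
```

Queries against this table are then fully typed, and `drizzle-kit` can generate SQL migrations from the schema definition.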
This app ships with the Anthropic provider as the default. However, with the AI SDK, you can switch LLM providers to OpenAI, Ollama, Cohere, and many more with just a few lines of code.
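For example, switching providers is typically a one-line change to the model passed into an AI SDK call. This is a sketch; the model IDs shown are assumptions and may need updating:

```typescript
// Hedged sketch of provider switching with the AI SDK.
// Assumes the `ai`, `@ai-sdk/anthropic`, and `@ai-sdk/openai` packages are installed.
import { generateText } from 'ai';
import { anthropic } from '@ai-sdk/anthropic';
// import { openai } from '@ai-sdk/openai';

const { text } = await generateText({
  // Swap the model to change providers, e.g. openai('gpt-4o').
  model: anthropic('claude-3-5-sonnet-latest'),
  prompt: 'Explain what a sandbox is in one sentence.',
});

console.log(text);
```

The rest of the application code (streaming, tool calls, structured output) stays the same regardless of which provider backs the model.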
Integration with Model Context Protocol (MCP) connectors via AI SDK v5 for accessing external services and APIs. Use available preset connectors or configure your own custom MCP servers:
- Context7
  - Access comprehensive library documentation and code snippets
  - Real-time code examples and API references for popular frameworks
- Figma
  - Integrate with Figma designs and assets
  - Access design files, components, and export assets programmatically
- Hugging Face
  - Access Hugging Face models and datasets
  - Run inference and download pre-trained models for AI workflows
- Linear
  - Manage tasks and projects in Linear
  - Create issues, update statuses, and query project data
- Notion
  - Access and manage Notion pages and databases
  - Read and write content, query database entries, and manage blocks
- Supabase
  - Interact with Supabase databases and services
  - Query tables, manage rows, and access real-time subscriptions
- Browserbase
  - Browser automation and web scraping capabilities
  - Take screenshots, navigate pages, and extract content from websites
- Convex
  - Connect to Convex backends and databases
  - Query data, run functions, and interact with Convex deployments
- Playwright
  - Advanced browser automation and testing
  - Control browsers, interact with pages, and perform end-to-end testing
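A custom MCP server can be wired up through the AI SDK's experimental MCP client. The sketch below is a configuration example: the endpoint URL is a placeholder, and the experimental API surface may change between AI SDK releases:

```typescript
// Hedged sketch: connecting a custom MCP server via the AI SDK.
// Assumes the `ai` package is installed; the URL is a placeholder.
import { experimental_createMCPClient } from 'ai';

const mcpClient = await experimental_createMCPClient({
  transport: {
    type: 'sse',
    url: 'https://example.com/mcp/sse', // your MCP server endpoint
  },
});

// The discovered tools can be passed straight into generateText/streamText.
const tools = await mcpClient.tools();
```

Preset connectors like those listed above follow the same pattern, just with their own endpoints and authentication.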
You can deploy your own version of VibeStack to Vercel with one click.
You will need to use the environment variables defined in `.env.example` to run VibeStack. It's recommended you use Vercel Environment Variables for this, but a `.env` file is all that is necessary.

Note: You should not commit your `.env` file or it will expose secrets that will allow others to control access to your various AI provider accounts.
- Install Vercel CLI: `npm i -g vercel`
- Link your local instance with your Vercel and GitHub accounts (creates a `.vercel` directory): `vercel link`
- Download your environment variables: `vercel env pull`

Then install dependencies and start the dev server:

```bash
bun install
bun dev
```

Your app should now be running on localhost:3000.