An embeddable chat widget that lets your AI chatbots render rich, interactive UI like buttons, forms, charts, cards and more instead of plain text. Works out of the box with LangGraph/LangChain and n8n.
- 🎨 Beautiful UI - Clean, modern chat interface
- 🚀 Easy Integration - Single script tag or npm package
- 💬 Session Management - Automatic session handling with persistent threads
- 💾 Flexible Storage - localStorage, LangGraph-managed, or in-memory
- 🗂️ Thread Management - Create, switch, and delete conversation threads
- 🌓 Theme Support - Light and dark mode
- 📱 Responsive - Works perfectly on mobile and desktop
- 🔌 Multi-Provider - LangGraph, n8n, Make.com, or custom webhooks
- 🌊 Streaming Support - Real-time streaming responses from your backend
- 📐 Multiple Layouts - Full-page, side panel, or bottom tray form factors
Quick start with LangGraph (via CDN):

```html
<link
  href="https://cdn.jsdelivr.net/npm/genui-widget/dist/genui-widget.css"
  rel="stylesheet"
/>
<script type="module">
  import { createChat } from "https://cdn.jsdelivr.net/npm/genui-widget/dist/genui-widget.es.js";

  createChat({
    langgraph: {
      deploymentUrl: "https://your-deployment.langgraph.app",
      assistantId: "your-assistant-id",
    },
    storageType: "langgraph", // Use LangGraph's built-in thread management
    agentName: "Assistant",
  });
</script>
```

Quick start with n8n (via CDN):

```html
<link
  href="https://cdn.jsdelivr.net/npm/genui-widget/dist/genui-widget.css"
  rel="stylesheet"
/>
<script type="module">
  import { createChat } from "https://cdn.jsdelivr.net/npm/genui-widget/dist/genui-widget.es.js";

  createChat({
    n8n: {
      webhookUrl: "YOUR_WEBHOOK_URL",
      enableStreaming: true, // Optional: enable streaming responses
    },
    storageType: "localstorage", // Persist chats locally
    agentName: "Assistant",
  });
</script>
```

Installation via CDN: see the Quick Start above. Alternatively, install via npm:
```bash
npm install genui-widget
```

```js
import { createChat } from "genui-widget";

// With LangGraph
const chat = createChat({
  langgraph: {
    deploymentUrl: "https://your-deployment.langgraph.app",
    assistantId: "your-assistant-id",
  },
  storageType: "langgraph",
});

// OR with n8n/webhooks
const chat = createChat({
  n8n: {
    webhookUrl: "YOUR_WEBHOOK_URL",
    enableStreaming: true,
  },
  storageType: "localstorage",
});
```

Full configuration example:

```js
const chat = createChat({
  // Provider configuration (choose one)
  langgraph: {
    deploymentUrl: "https://your-deployment.langgraph.app",
    assistantId: "your-assistant-id",
  },
  // OR
  n8n: {
    webhookUrl: "https://your-webhook-endpoint.com/chat",
    enableStreaming: true, // Optional: Enable streaming responses
  },

  // Optional settings
  agentName: "Assistant", // Bot/agent name
  logoUrl: "https://example.com/logo.png", // Logo image URL
  theme: { mode: "light" }, // 'light' or 'dark'
  storageType: "langgraph", // 'none', 'localstorage', or 'langgraph'
  formFactor: "full-page", // 'full-page', 'side-panel', or 'bottom-tray'
  enableDebugLogging: false, // Enable console debug logging

  // Optional: Callbacks
  onSessionStart: (sessionId) => {
    console.log("Session started:", sessionId);
  },
  onError: (error) => {
    console.error("Chat error:", error);
  },
});
```

`storageType: "none"` (default):
- Messages work normally during the session
- All data is lost on page refresh
- Best for: Simple use cases, demos, or privacy-focused applications
storageType: "localstorage":
- Chat conversations persist across page refreshes
- Users can create and manage multiple threads
- Thread history is saved to browser localStorage
- Best for: n8n/webhook integrations without built-in persistence
storageType: "langgraph":
- Leverages LangGraph's built-in thread management
- Conversations persist server-side across devices
- Requires `langgraph` provider configuration
- Thread operations (create, delete, update) sync with the LangGraph API
- Best for: LangGraph deployments requiring cross-device sync
Widget API methods:

```js
// Get current session ID
const sessionId = chat.getSessionId();

// Open the chat window
chat.open();

// Close the chat window
chat.close();

// Destroy the widget completely
chat.destroy();
```

The widget integrates seamlessly with LangGraph deployments (Cloud or self-hosted).
Configuration:

```js
createChat({
  langgraph: {
    deploymentUrl: "https://your-deployment.langgraph.app",
    assistantId: "your-assistant-id",
  },
  storageType: "langgraph", // Recommended for LangGraph
});
```

Features:
- ✅ Automatic Thread Management - Creates and manages threads via LangGraph API
- ✅ Server-Side Persistence - Conversations persist across devices
- ✅ Streaming Support - Real-time streaming via Server-Sent Events (SSE)
- ✅ Message History - Fetches and displays conversation history
- ✅ Thread Operations - Create, update, delete threads with metadata
How it works:
- Widget calls `POST /threads` to create new conversation threads
- Messages are sent via `POST /threads/{thread_id}/runs/stream` with streaming enabled
- Thread history is retrieved via `GET /threads/{thread_id}/history`
- Thread list is fetched via `POST /threads/search`
The LangGraph provider automatically handles the streaming response format and extracts message content from the SSE events.
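For illustration, here is a rough sketch of the equivalent raw HTTP calls the widget performs against the LangGraph API (the URL and assistant ID are placeholders, the exact payload shape is an assumption, and error handling is omitted):

```js
const deploymentUrl = "https://your-deployment.langgraph.app";
const assistantId = "your-assistant-id";

// Create a new conversation thread
const thread = await fetch(`${deploymentUrl}/threads`, {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({}),
}).then((r) => r.json());

// Send a message as a streaming run (the response body is an SSE stream)
const run = await fetch(`${deploymentUrl}/threads/${thread.thread_id}/runs/stream`, {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    assistant_id: assistantId,
    input: { messages: [{ role: "user", content: "Hello!" }] },
  }),
});

// Retrieve the thread's history
const history = await fetch(`${deploymentUrl}/threads/${thread.thread_id}/history`)
  .then((r) => r.json());
```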
The chat client sends POST requests to your webhook:
```json
{
  "chatInput": "User's message here",
  "sessionId": "uuid-v4-session-id"
}
```

Non-streaming mode:

```json
{
  "output": "Your bot's response here"
}
```

Streaming mode (`enableStreaming: true`):

Return line-delimited JSON chunks:

```
{ "type": "item", "content": "First chunk " }
{ "type": "item", "content": "second chunk " }
{ "type": "item", "content": "final chunk" }
```
Follow the instructions at thesys.dev/n8n to quickly set up your n8n workflow.
Complete list of all available options:
You must configure either `langgraph` OR `n8n` (not both):
```ts
langgraph?: {
  // Required: Your LangGraph deployment URL
  deploymentUrl: string;
  // Required: The assistant ID to use
  assistantId: string;
}
```

Use this for LangGraph Cloud or self-hosted deployments. When using LangGraph, set `storageType: "langgraph"` to leverage server-side thread management.
```ts
n8n?: {
  // Required: Your webhook URL
  webhookUrl: string;
  // Optional: Enable streaming responses (default: false)
  enableStreaming?: boolean;
  // Optional: Custom webhook configuration
  webhookConfig?: {
    method?: string; // HTTP method (default: "POST")
    headers?: Record<string, string>; // Custom headers
  };
}
```

Use this for n8n, Make.com, or any custom webhook endpoint.
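For example, to attach an authorization header to every webhook request (the header value is a placeholder):

```js
createChat({
  n8n: {
    webhookUrl: "https://your-webhook-endpoint.com/chat",
    webhookConfig: {
      headers: { Authorization: "Bearer YOUR_API_TOKEN" },
    },
  },
});
```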
```ts
agentName?: string; // Default: "Assistant"
```

The name displayed for the bot/agent in the chat interface.
```ts
logoUrl?: string;
```

URL to a logo image that will be displayed in the chat interface.
```ts
enableDebugLogging?: boolean; // Default: false
```

Enable debug logging to the browser console. Useful for troubleshooting webhook integration issues.
```ts
theme?: {
  mode: 'light' | 'dark'; // Default: 'light'
}
```

Sets the color scheme for the chat interface.
```ts
storageType?: 'none' | 'localstorage' | 'langgraph'; // Default: 'none'
```

Controls chat history persistence:

- `'none'` - Messages are kept in memory only and lost on page refresh
- `'localstorage'` - Messages are saved to browser localStorage and persist across sessions
- `'langgraph'` - Uses LangGraph's server-side thread management (requires the `langgraph` provider)
```ts
formFactor?: 'full-page' | 'side-panel' | 'bottom-tray'; // Default: 'full-page'
```

Controls the layout form factor:

- `'full-page'` - Takes up the entire viewport
- `'side-panel'` - Displays as a side panel on the right
- `'bottom-tray'` - Appears as a collapsible tray at the bottom of the screen

Note: The `mode` property is deprecated. Use `formFactor` instead (`'fullscreen'` → `'full-page'`, `'sidepanel'` → `'side-panel'`).
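For instance, to render the widget as a side panel:

```js
createChat({
  n8n: { webhookUrl: "YOUR_WEBHOOK_URL" },
  formFactor: "side-panel", // previously mode: "sidepanel"
});
```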
```ts
bottomTray?: {
  // Control the open state of the bottom tray (controlled mode)
  isOpen?: boolean;
  // Callback when bottom tray open state changes
  onOpenChange?: (isOpen: boolean) => void;
  // Default open state for bottom tray (uncontrolled mode)
  defaultOpen?: boolean;
}
```

Configuration options specific to the bottom-tray form factor. Only used when `formFactor` is set to `'bottom-tray'`.
Example usage:

```js
createChat({
  n8n: { webhookUrl: "YOUR_WEBHOOK_URL" },
  formFactor: "bottom-tray",
  bottomTray: {
    defaultOpen: false, // Start collapsed
    onOpenChange: (isOpen) => {
      console.log("Tray is now:", isOpen ? "open" : "closed");
    },
  },
});
```

Controlled vs. uncontrolled mode:

- Uncontrolled: Use `defaultOpen` to set the initial state and let the widget manage it
- Controlled: Use `isOpen` and `onOpenChange` to fully control the tray state externally (see the sketch below)
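A controlled-mode sketch, assuming your application owns the open state (`trayOpen` is hypothetical app state; how an updated `isOpen` value is pushed back into the widget depends on your rendering setup):

```js
// App-level state that drives the tray
let trayOpen = false;

const chat = createChat({
  n8n: { webhookUrl: "YOUR_WEBHOOK_URL" },
  formFactor: "bottom-tray",
  bottomTray: {
    isOpen: trayOpen, // state owned by your app
    onOpenChange: (isOpen) => {
      // Keep app state in sync when the user toggles the tray
      trayOpen = isOpen;
    },
  },
});
```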
```ts
onSessionStart?: (sessionId: string) => void;
```

Callback function that fires when a new chat session is created. Receives the session ID as a parameter. Useful for analytics or tracking.
```ts
onError?: (error: Error) => void;
```

Callback function that fires when an error occurs during message processing. Useful for logging, analytics, or custom error handling. Note that the widget will still display error states in the chat UI automatically.
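For example, both callbacks can be forwarded to an analytics endpoint (the `/analytics` URL and payload shape are placeholders):

```js
createChat({
  n8n: { webhookUrl: "YOUR_WEBHOOK_URL" },
  onSessionStart: (sessionId) => {
    // Hypothetical analytics call; replace with your own tracker
    navigator.sendBeacon(
      "/analytics",
      JSON.stringify({ event: "chat_session_start", sessionId })
    );
  },
  onError: (error) => {
    console.error("Chat error:", error); // the widget also shows the error in the UI
  },
});
```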
Troubleshooting:

- Check the browser console for errors (enable `enableDebugLogging: true`)
- Verify the provider configuration is correct (LangGraph URL/assistant ID or webhook URL)
- Ensure endpoint is active and accessible
- Check CORS settings
For LangGraph:
- Verify `deploymentUrl` and `assistantId` are correct
- Check that the LangGraph deployment is running
- Ensure your assistant is deployed and accessible
For n8n/webhooks:
- Verify webhook URL is correct
- Check CORS configuration
- Ensure your domain is allowlisted (for n8n)
- Test the webhook endpoint independently (see the snippet below)
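To test the webhook outside the widget, you can replay the documented request contract by hand:

```js
// Send the same payload the widget would send and inspect the raw response
const res = await fetch("YOUR_WEBHOOK_URL", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ chatInput: "ping", sessionId: "test-session" }),
});
console.log(res.status, await res.text());
```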
For LangGraph:
- Check that streaming is working (SSE connection)
- Verify assistant is responding correctly
- Check thread creation/access permissions
For n8n/webhooks:
- Verify the response format: `{ "output": "message" }`
- For streaming, ensure line-delimited JSON format
- Check webhook execution logs
- Enable `enableStreaming` if using streaming responses
- If using `storageType: "langgraph"`, ensure the LangGraph provider is configured
- For localStorage, check that browser storage isn't full
- Clear localStorage if you encounter a corrupted state: `localStorage.clear()`
Requirements:

- Node.js: Version 20.9.0 or higher (for development)

Supported browsers:

- Chrome/Edge (latest 2 versions)
- Firefox (latest 2 versions)
- Safari (latest 2 versions)
- Mobile browsers (iOS Safari, Chrome Mobile)
License: MIT

Support:

- GitHub Issues: Create an issue
- Documentation: View docs
Contributions are welcome! Please feel free to submit a Pull Request.