The all-in-one Desktop & Docker AI application with built-in RAG, AI agents, No-code agent builder, MCP compatibility, and more.
Updated Oct 27, 2025 - JavaScript
Generate a timeline of your day, automatically
kubewall - Single-Binary Kubernetes Dashboard with Multi-Cluster Management & AI Integration. (OpenAI / Claude 4 / Gemini / DeepSeek / OpenRouter / Ollama / Qwen / LMStudio)
ComfyUI-IF_AI_tools is a set of custom nodes for ComfyUI that allows you to generate prompts using a local Large Language Model (LLM) via Ollama. This tool enables you to enhance your image generation workflow by leveraging the power of language models.
RAGLight is a modular framework for Retrieval-Augmented Generation (RAG). It makes it easy to plug in different LLMs, embeddings, and vector stores, and now includes seamless MCP integration to connect external tools and data sources.
DevoxxGenie is a plugin for IntelliJ IDEA that uses local LLMs (Ollama, LMStudio, GPT4All, Jan, and Llama.cpp) and cloud-based LLMs to help review, test, and explain your project code.
A simple, locally hosted Web Search MCP server for use with Local LLMs
OADIN is a production-grade AI service infrastructure for AI PC development. It provides unified APIs for chat, embedding, text generation, and text-to-image services, with support for both local (Ollama) and cloud AI providers (OpenAI, DeepSeek, etc.), enabling AI applications to share resources without bundling their own AI stacks.
A persistent local memory for AI, LLMs, or Copilot in VS Code.
Python app for LM Studio-enhanced voice conversations with local LLMs. Uses Whisper for speech-to-text and offers a privacy-focused, accessible interface.
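Tools like the one above typically reach the local model through LM Studio's OpenAI-compatible HTTP server. The sketch below shows the general shape of such a request; the endpoint port, model name, and system prompt are assumptions for illustration, not taken from the app's source.

```python
# Hedged sketch: building a chat request for a local LM Studio server.
# LM Studio exposes an OpenAI-compatible API; the URL and model name here
# are illustrative defaults, not guaranteed to match any specific setup.
import json
import urllib.request

LMSTUDIO_URL = "http://localhost:1234/v1/chat/completions"  # assumed local endpoint

def build_chat_payload(user_text, model="local-model", temperature=0.7):
    """Assemble an OpenAI-compatible chat completion request body."""
    return {
        "model": model,
        "temperature": temperature,
        "messages": [
            {"role": "system", "content": "You are a helpful voice assistant."},
            {"role": "user", "content": user_text},
        ],
    }

def send_chat(payload):
    """POST the payload to the local server and return the first reply text."""
    req = urllib.request.Request(
        LMSTUDIO_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]

if __name__ == "__main__":
    # Only build and print the payload here; send_chat() needs a running server.
    print(json.dumps(build_chat_payload("What's on my calendar today?"), indent=2))
```

In a voice pipeline, the `user_text` would come from a speech-to-text step (e.g. Whisper) and the returned reply would be fed to text-to-speech, keeping everything on the local machine.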
High-performance lightweight proxy and load balancer for LLM infrastructure. Intelligent routing, automatic failover and unified model discovery across local and remote inference backends.
MESH-AI — Off-Grid AI + Mesh Router for Meshtastic - Seamlessly connect LM Studio, Ollama, OpenAI, 3rd-party APIs, & Home Assistant to your LoRa mesh. Supports custom commands, Twilio SMS (inbound/outbound), Discord channel routing, & GPS emergency alerts via SMS, email, or Discord + SO MUCH MORE!
RetroChat is a powerful command-line interface for interacting with various AI language models. It provides a seamless experience for engaging with different chat providers while offering robust features for managing and customizing your conversations. The code in this repo is 100% AI-generated; nothing has been written by a human.