Open-Source Analytics Platform with AI-Powered Natural Language Querying
NQRust-Analytics is a comprehensive analytics platform that enables users to query databases using natural language, generate accurate SQL queries, create visualizations, and gain AI-powered insights.
- Natural Language to SQL: Convert plain English questions into accurate SQL queries
- Multi-Database Support: Connect to various data sources seamlessly
- AI-Powered Insights: Get intelligent analysis and recommendations
- Interactive Visualizations: Auto-generate charts and dashboards
- Semantic Layer: MDL (Modeling Definition Language) for consistent data definitions
- Real-time Analytics: Process and analyze data in real-time
- User Authentication: Built-in user registration and login with email/password
- OAuth Support: Optional Google and GitHub OAuth integration
- Role-Based Access Control (RBAC): Admin, Editor, and Viewer roles
- Multi-Dashboard Support: Create and manage multiple dashboards per user
- Dashboard Sharing: Share dashboards with other users (view or edit permissions)
- Chat History Sharing: Share conversation threads with team members
- Starred Dashboards: Bookmark frequently used dashboards
- No SQL Knowledge Required: Business users can query data without technical expertise
- Accurate Results: Semantic layer ensures consistent and governed data access
- Flexible LLM Integration: Support for multiple AI model providers
- Self-Hosted: Full control over your data and infrastructure
- Team Collaboration: Share insights and dashboards across your organization
- Cloud Data Warehouses: Snowflake, BigQuery, Redshift, Azure Synapse
- Relational Databases: PostgreSQL, MySQL, SQL Server, Oracle
- Analytics Engines: DuckDB, Trino, ClickHouse, Athena
- And more: Extensible architecture for custom connectors
- OpenAI (GPT-4, GPT-3.5)
- Azure OpenAI
- Google Gemini (AI Studio & Vertex AI)
- Anthropic Claude (API & Bedrock)
- DeepSeek
- Groq
- Ollama (Local models)
- Databricks
See configuration examples for setup details.
- Docker & Docker Compose
- OpenAI API key (or other LLM provider)
- Minimum 4GB RAM
1. Clone the repository

   ```bash
   git clone https://github.com/NexusQuantum/NQRust-Analytics.git
   cd NQRust-Analytics
   ```

2. Configure environment

   ```bash
   cd docker
   cp .env.example .env
   ```

   Edit `.env` and configure the following:

   - `PROJECT_DIR`: Set to your local project path
   - `OPENAI_API_KEY`: Your OpenAI API key (or configure an alternative LLM)
   - `JWT_SECRET`: Required - set a secure random string for authentication
   - `PG_URL`: PostgreSQL connection string for the application database
   - OAuth settings (optional): Configure Google/GitHub OAuth if needed

3. Start services

   ```bash
   docker-compose up -d
   ```

4. Run database migrations

   ```bash
   # The migrations set up user management, dashboards, and sharing features
   cd ../analytics-ui
   npm install  # or yarn install
   DB_TYPE=pg PG_URL="postgres://analytics:analytics123@localhost:5432/analytics" npx knex migrate:latest
   ```

5. Access the application

6. First-time setup

   - Register a new account (the first user becomes admin)
   - Or log in with OAuth if configured
   - Create your first dashboard and start querying
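A minimal `.env` for the configuration step above might look like this (all values are placeholders, not working credentials; the `PG_URL` value matches the default used in the migration command):

```
PROJECT_DIR=/home/you/NQRust-Analytics
OPENAI_API_KEY=sk-...
JWT_SECRET=change-me-to-a-long-random-string
PG_URL=postgres://analytics:analytics123@localhost:5432/analytics
```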
See individual service README files for detailed development instructions:
```
NQRust-Analytics/
├── analytics-ai-service/   # AI/LLM service (Python/FastAPI)
├── analytics-ui/           # Web interface (Next.js/React)
├── analytics-launcher/     # CLI launcher (Go)
├── analytics-engine/       # Query engine (submodule)
├── deployment/             # Kubernetes & deployment configs
├── docker/                 # Docker compose files
└── docs/                   # Documentation
```
- Analytics UI (Next.js)
  - User interface for data exploration
  - Dashboard creation and management
  - Connection management
- Analytics AI Service (Python/FastAPI)
  - Natural language processing
  - SQL generation via LLM
  - RAG (Retrieval-Augmented Generation) pipelines
- Analytics Engine (Rust/Java)
  - Query execution and optimization
  - MDL (semantic layer) processing
  - Multi-database connectivity
- Supporting Services
  - Qdrant: Vector database for embeddings
  - PostgreSQL: Metadata storage
```
User Query (Natural Language)
        ↓
   Analytics UI
        ↓
Analytics AI Service (LLM Processing)
        ↓
Analytics Engine (SQL Execution)
        ↓
   Data Source
        ↓
Results + Visualizations
```
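The flow above can be sketched in Python as a minimal illustration, with a stubbed LLM and an in-memory SQLite data source. Function names, the `orders` table, and the keyword matching are hypothetical stand-ins, not the platform's actual API:

```python
import sqlite3

def stub_llm_to_sql(question: str) -> str:
    """Stand-in for the AI service: map a natural-language question to SQL."""
    # A real deployment would call the configured LLM provider here.
    if "total revenue" in question.lower():
        return "SELECT SUM(amount) FROM orders"
    raise ValueError(f"unsupported question: {question!r}")

def answer(question: str, conn: sqlite3.Connection):
    """UI -> AI service (SQL generation) -> engine (execution) -> results."""
    sql = stub_llm_to_sql(question)      # Analytics AI Service
    rows = conn.execute(sql).fetchall()  # Analytics Engine / data source
    return sql, rows                     # Results for visualization

# Demo data source
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?)", [(10.0,), (15.5,)])

sql, rows = answer("What is the total revenue?", conn)
print(sql)   # SELECT SUM(amount) FROM orders
print(rows)  # [(25.5,)]
```

In the real system each arrow crosses a service boundary (HTTP between UI, AI service, and engine); the sketch collapses them into function calls to show the data flow only.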
Edit `docker/config.yaml` to configure your LLM provider:

```yaml
llm_provider:
  type: openai  # or azure_openai, gemini, anthropic, etc.
  api_key: ${OPENAI_API_KEY}
  model: gpt-4
```

Configure data sources through the UI or via the API. Supported authentication methods:
- Username/Password
- API Keys
- Service Account (for cloud providers)
- SSH Tunneling
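As an example of swapping providers, an Ollama setup might look like the following. The exact keys depend on the config schema; `base_url` and the model name are assumptions, though port 11434 is Ollama's default endpoint:

```yaml
llm_provider:
  type: ollama
  base_url: http://localhost:11434  # default Ollama endpoint (assumption)
  model: llama3                     # any locally pulled model
```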
We welcome contributions! Here's how to get started:
- ⭐ Star the repository
- 🐛 Report bugs via GitHub Issues
- 💡 Suggest features in Discussions
- 📖 Read the Contributing Guidelines
- 🔧 Submit pull requests
This project is licensed under the AGPL-3.0 License - see the LICENSE file for details.
Built with:
- Next.js & React
- FastAPI & Python
- Rust & Java
- PostgreSQL & Qdrant
- Docker & Kubernetes
- Issues: GitHub Issues
- Discussions: GitHub Discussions
Made with ❤️ by NexusQuantum