A fully featured, locally hosted AI chat application with multi-user support, powered by Ollama and PostgreSQL.
- 🤖 AI Chat - Powered by Ollama (qwen2.5:1.5b by default, or any installed model)
- 💬 Multi-User Support - PostgreSQL-based authentication and data isolation
- 🎨 Modern UI - Dark theme with Three.js particle background
- 📝 Markdown Support - Full markdown rendering with code syntax highlighting
- 🔄 Streaming Responses - Real-time AI responses
- 💾 Conversation Management - Create, rename, delete conversations
- 🔐 Secure Authentication - Session-based auth with bcrypt
- 📱 Responsive Design - Works on desktop, tablet, and mobile
Backend:

- Node.js & Express.js
- PostgreSQL
- Ollama (AI models)
- Session-based authentication
Frontend:

- React 18 with Vite
- Tailwind CSS
- Zustand (state management)
- Three.js (3D background)
- React Markdown
Prerequisites:

- Node.js (v18 or higher)
- PostgreSQL (v14 or higher)
- Ollama (with at least one model installed)
Install Ollama:

```bash
curl -fsSL https://ollama.com/install.sh | sh
```

Or download from https://ollama.com/download
Pull at least one model:

```bash
# Recommended lightweight model
ollama pull qwen2.5:1.5b

# Or other models
ollama pull llama2
ollama pull mistral
ollama pull codellama
```

Verify Ollama is running:

```bash
ollama list
```
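Ollama also serves a local HTTP API (http://localhost:11434 by default), so you can check for installed models programmatically as well. A minimal Node.js sketch, assuming Node 18+ for the built-in `fetch`:

```js
// Check Ollama's local HTTP API for installed models.
// Run as an ES module (e.g. check-ollama.mjs) with Node 18+.
const res = await fetch('http://localhost:11434/api/tags');
const { models } = await res.json();
console.log('Installed models:', models.map((m) => m.name).join(', '));
```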
Install the project dependencies:

```bash
cd think_ai

# Install backend dependencies
cd server
npm install

# Install frontend dependencies
cd ../client
npm install
```

Create a PostgreSQL database:
```bash
# Login to PostgreSQL
psql -U postgres

# Create database
CREATE DATABASE ai_chat;

# Create user (optional)
CREATE USER ai_chat_user WITH PASSWORD 'your_password';
GRANT ALL PRIVILEGES ON DATABASE ai_chat TO ai_chat_user;

# Exit
\q
```

Create a `.env` file in the server directory:
```bash
cd server
cp .env.example .env
```

Edit `.env` with your configuration:
```env
# Server Configuration
PORT=3001
NODE_ENV=development

# Database Configuration
DB_HOST=localhost
DB_PORT=5432
DB_NAME=ai_chat
DB_USER=postgres
DB_PASSWORD=your_password_here

# Session Secret (change this!)
SESSION_SECRET=change-this-to-a-random-secret-key

# Ollama Configuration
OLLAMA_BASE_URL=http://localhost:11434
OLLAMA_MODEL=qwen2.5:1.5b

# CORS
CORS_ORIGIN=http://localhost:5173
```
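For reference, the server reads these database settings from the environment. A minimal sketch of a node-postgres pool built from them, assuming dotenv loads `.env` (the actual file under `server/src/config` may differ):

```js
// Illustrative sketch only; the real config lives in server/src/config.
import 'dotenv/config';
import pg from 'pg';

export const pool = new pg.Pool({
  host: process.env.DB_HOST,
  port: Number(process.env.DB_PORT),
  database: process.env.DB_NAME,
  user: process.env.DB_USER,
  password: process.env.DB_PASSWORD,
});
```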
Run the database migrations:

```bash
cd server
npm run migrate
```

You should see:
```
✓ Users table created
✓ Conversations table created
✓ Messages table created
✓ Generated files table created
✓ Settings table created
✓ Indexes created
✓ Triggers created

✅ All migrations completed successfully
```
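As a rough idea of what the migration sets up, the users table looks something like the sketch below. This is illustrative only (column names are assumptions); the authoritative schema is the migration script itself:

```js
// Illustrative only: the real schema is defined by `npm run migrate`.
// Column names (password_hash, created_at) are assumptions.
import { pool } from './config/db.js'; // hypothetical import path

await pool.query(`
  CREATE TABLE IF NOT EXISTS users (
    id            SERIAL PRIMARY KEY,
    username      TEXT UNIQUE NOT NULL,
    email         TEXT UNIQUE NOT NULL,
    password_hash TEXT NOT NULL,  -- bcrypt hash, never the plain password
    created_at    TIMESTAMPTZ NOT NULL DEFAULT now()
  )
`);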
Start the backend server:

```bash
cd server
npm run dev
```

You should see:
```
╔═══════════════════════════════════════════╗
║                                           ║
║              AI Chat Server               ║
║                                           ║
║      Server running on port 3001          ║
║      Environment: development             ║
║                                           ║
╚═══════════════════════════════════════════╝

✓ Database connected successfully
✓ Ollama connected successfully
✓ Available models: qwen2.5:1.5b
```
In a new terminal, start the frontend:

```bash
cd client
npm run dev
```

You should see:

```
  VITE v5.0.8  ready in 500 ms

  ➜  Local:   http://localhost:5173/
  ➜  Network: use --host to expose
```
Navigate to http://localhost:5173
1. Register an Account
   - Click "Sign Up" on the login page
   - Enter a username, email, and password
   - Submit to create your account
2. Start Chatting
   - Click "New Chat" in the sidebar
   - Type your message in the input box
   - Press Enter or click Send
   - Watch the AI respond in real time!
Managing conversations:

- New Conversation: Click the "New Chat" button
- Rename: Click on the conversation title to edit
- Delete: Hover over a conversation and click the delete icon
- Switch: Click any conversation to load it
Message actions:

- Send: Type and press Enter (Shift+Enter for new line)
- Regenerate: Click regenerate icon on AI messages
- Copy: Click copy icon to copy message content
- Delete: Click delete icon to remove a message
Keyboard shortcuts:

- `Enter` - Send message
- `Shift + Enter` - New line in message
- `Ctrl/Cmd + N` - New conversation (when implemented)
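In the React client, this input handling typically lives in the chat input component. A minimal sketch, assuming a hypothetical `onSend` callback (the real component under `client/src/components/Chat` may differ):

```jsx
import { useState } from 'react';

// Illustrative sketch: Enter sends, Shift+Enter inserts a newline.
// `onSend` is a hypothetical prop supplied by the chat container.
export default function MessageInput({ onSend }) {
  const [text, setText] = useState('');

  const handleKeyDown = (event) => {
    if (event.key === 'Enter' && !event.shiftKey) {
      event.preventDefault(); // keep the newline out of the textarea
      if (text.trim()) onSend(text.trim());
      setText('');
    }
  };

  return (
    <textarea
      value={text}
      onChange={(e) => setText(e.target.value)}
      onKeyDown={handleKeyDown}
      placeholder="Type your message..."
    />
  );
}
```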
Authentication endpoints:

- `POST /api/auth/register` - Register new user
- `POST /api/auth/login` - Login
- `POST /api/auth/logout` - Logout
- `GET /api/auth/me` - Get current user
Conversation endpoints:

- `GET /api/conversations` - List all conversations
- `POST /api/conversations` - Create conversation
- `GET /api/conversations/:id` - Get conversation
- `PUT /api/conversations/:id` - Update conversation
- `DELETE /api/conversations/:id` - Delete conversation
- `GET /api/conversations/:id/messages` - Get messages
Message endpoints:

- `POST /api/conversations/:id/messages` - Send message (supports streaming)
- `POST /api/conversations/messages/:id/regenerate` - Regenerate message
- `DELETE /api/conversations/messages/:id` - Delete message
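Putting the endpoints together, a typical client-side (browser) flow looks roughly like the sketch below. The request/response field names (`username`, `password`, `title`, `content`, `conversation.id`) and the streaming payload format are assumptions; check the server code for the exact shapes:

```js
// Illustrative sketch. Session auth uses cookies, so every call needs
// credentials: 'include'.
const API = 'http://localhost:3001';

// Log in (the server sets a session cookie)
await fetch(`${API}/api/auth/login`, {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  credentials: 'include',
  body: JSON.stringify({ username: 'alice', password: 'secret' }),
});

// Create a conversation (response shape assumed to include an id)
const convRes = await fetch(`${API}/api/conversations`, {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  credentials: 'include',
  body: JSON.stringify({ title: 'My first chat' }),
});
const conversation = await convRes.json();

// Send a message and read the streamed AI response chunk by chunk
const res = await fetch(`${API}/api/conversations/${conversation.id}/messages`, {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  credentials: 'include',
  body: JSON.stringify({ content: 'Hello!' }),
});

const reader = res.body.getReader();
const decoder = new TextDecoder();
let answer = '';
while (true) {
  const { value, done } = await reader.read();
  if (done) break;
  answer += decoder.decode(value, { stream: true });
}
console.log(answer);
```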
Project structure:

```
think_ai/
├── client/                         # React frontend
│   ├── src/
│   │   ├── components/             # React components
│   │   │   ├── Chat/               # Chat UI components
│   │   │   ├── Sidebar/            # Sidebar components
│   │   │   └── ThreeBackground/    # 3D background
│   │   ├── store/                  # Zustand store
│   │   ├── utils/                  # API utilities
│   │   └── App.jsx                 # Main app
│   └── package.json
│
├── server/                         # Node.js backend
│   ├── src/
│   │   ├── config/                 # Database config
│   │   ├── controllers/            # Route controllers
│   │   ├── services/               # Ollama service
│   │   ├── middleware/             # Express middleware
│   │   ├── routes/                 # API routes
│   │   └── server.js               # Main server
│   └── package.json
│
└── storage/                        # File storage
    ├── uploads/
    ├── generated/
    └── temp/
```
Troubleshooting:

Error: connect ECONNREFUSED

Solution: Make sure PostgreSQL is running:

```bash
# Linux/Mac
sudo service postgresql start

# Check status
sudo service postgresql status
```

⚠ Ollama not connected

Solution: Start Ollama:

```bash
ollama serve
```

In another terminal, verify models:

```bash
ollama list
```

Error: listen EADDRINUSE: address already in use :::3001
Solution: Change the port in `server/.env`:

```env
PORT=3002
```

Or kill the process using the port:

```bash
# Find process
lsof -i :3001

# Kill it
kill -9 <PID>
```

Access to fetch blocked by CORS policy

Solution: Update CORS_ORIGIN in `server/.env`:

```env
CORS_ORIGIN=http://localhost:5173
```
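On the server side, that origin value is typically what the Express CORS middleware is configured with. A minimal sketch using the `cors` package (the actual setup in `server/src/server.js` may differ):

```js
// Illustrative sketch of CORS setup for a session-cookie-based API.
import express from 'express';
import cors from 'cors';

const app = express();

app.use(cors({
  origin: process.env.CORS_ORIGIN, // e.g. http://localhost:5173
  credentials: true,               // needed so the session cookie is sent
}));
```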
To access your chat from outside your network:

- Install ngrok:

  ```bash
  # Linux/Mac
  brew install ngrok

  # Or download from https://ngrok.com/download
  ```

- Start ngrok:

  ```bash
  ngrok http 3001
  ```

- Update the frontend API calls to use the ngrok URL (see the sketch below)
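One way to do that is to keep the API base URL in a single place in the client. This is a hypothetical helper; `VITE_API_URL` is not necessarily how this project wires it up:

```js
// Hypothetical client helper: read the API base URL from a Vite env variable
// so it can be switched to the ngrok URL without editing every fetch call.
const API_BASE_URL = import.meta.env.VITE_API_URL ?? 'http://localhost:3001';

export const apiUrl = (path) => `${API_BASE_URL}${path}`;
```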
Performance tips:

- Use a lightweight model for faster responses (qwen2.5:1.5b, phi, etc.)
- Limit conversation history for better performance (see the sketch after this list)
- Use SSD storage for PostgreSQL for faster queries
- Allocate enough RAM (4-8GB recommended for model + app)
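For the history limit, the idea is simply to cap how many prior messages go into the prompt sent to Ollama. An illustrative sketch; `MAX_HISTORY` and `buildContext` are made-up names, not the app's actual code:

```js
// Illustrative: bound the conversation context sent to the model.
const MAX_HISTORY = 20; // tune to taste; fewer messages = faster responses

export function buildContext(messages) {
  // Keep only the most recent messages so the prompt stays small.
  return messages.slice(-MAX_HISTORY);
}
```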
Development:

```bash
cd server
npm run dev    # Auto-restart on changes
```

```bash
cd client
npm run dev    # Hot reload enabled
```

To run migrations again:

```bash
cd server
npm run migrate
```

Security notes:

- Change `SESSION_SECRET` in `.env` to a random secure key (see the sketch after this list)
- Use HTTPS in production
- Set `NODE_ENV=production`
- Enable PostgreSQL SSL
- Use environment-specific `.env` files
- Never commit `.env` files to version control
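One simple way to generate a strong secret is with Node's built-in crypto module; run this once and paste the output into `server/.env`:

```js
// Prints a 96-character hex string suitable for SESSION_SECRET.
import { randomBytes } from 'node:crypto';

console.log(randomBytes(48).toString('hex'));
```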
Licensed under the MIT License.
Contributions are welcome:

- Fork the repository
- Create a feature branch
- Commit your changes
- Push to the branch
- Create a Pull Request
For issues and questions:
- Check the troubleshooting section
- Review Ollama documentation: https://ollama.com
- Check PostgreSQL logs
Planned features:

- File upload support (images, PDFs, documents)
- Voice input/output (TTS/STT)
- Image generation
- Export conversations (PDF, DOCX)
- Advanced settings (temperature, max tokens)
- Model switching in UI
- Conversation search
- Message editing
- Dark/Light theme toggle
Built with:
- Ollama - Local AI models
- React - UI framework
- Three.js - 3D graphics
- Tailwind CSS - Styling
- Express - Backend framework
- PostgreSQL - Database