
🎯 Your Intelligent Study Companion - Fully Offline AI-Powered Learning
This project is currently in early development. Features are being actively built and tested. Expect breaking changes and incomplete functionality.
StudyPal-AI is being built as an offline-first AI study assistant that pairs locally running language models with complete privacy and data ownership. Our mission is to democratize access to personalized education technology while ensuring your learning data remains entirely under your control.
- Smart Explanations: Get clear, context-aware explanations for any topic
- Interactive Interface: Beautiful, responsive UI with real-time processing
- Offline AI: Powered by locally-running Llama models via Ollama
- Privacy-First: All processing happens on your device
- Document Processing: Upload and extract text from PDF files (see the parsing sketch after this list)
- AI Summarization: Generate concise summaries of lengthy documents
- Chunk Analysis: Break down large documents into manageable sections
- Visual Interface: Modern drag-and-drop file upload experience
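
The document-processing features above are implemented in the backend's PDF utilities (backend/utils/pdf_parser.py). The snippet below is only a rough sketch of the general approach, assuming the pypdf library; the project's actual parser and chunking strategy may differ:

```python
# Illustrative only: the real logic lives in backend/utils/pdf_parser.py and may differ.
# Assumes the pypdf package is installed (pip install pypdf).
from typing import List

from pypdf import PdfReader


def extract_text(pdf_path: str) -> str:
    """Concatenate the text of every page in a PDF."""
    reader = PdfReader(pdf_path)
    return "\n".join(page.extract_text() or "" for page in reader.pages)


def chunk_text(text: str, chunk_size: int = 2000, overlap: int = 200) -> List[str]:
    """Split extracted text into overlapping chunks that fit an LLM context window."""
    chunks, start = [], 0
    step = chunk_size - overlap
    while start < len(text):
        chunks.append(text[start:start + chunk_size])
        start += step
    return chunks
```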
📚 Smart Study Plans (Coming Soon)
- Generate personalized study schedules based on your goals and timeline
- Adaptive planning that adjusts to your progress and performance
- Multi-subject support with intelligent cross-referencing
- Learning style optimization and study method recommendations
🧩 Interactive Quizzes (Coming Soon)
- AI-generated quizzes tailored to your study materials
- Multiple question formats (MCQ, True/False, Short Answer, Essays)
- Real-time feedback with detailed explanations
- Progress tracking and performance analytics
📝 Smart Notes (Coming Soon)
- Intelligent note-taking with AI-powered insights
- Automatic categorization, tagging, and organization
- Cross-reference connections between concepts
- Summary generation and key point extraction
🎯 Learning Analytics (Coming Soon)
- Comprehensive learning progress tracking
- Identify knowledge gaps and suggest focus areas
- Personalized learning recommendations
- Study session optimization and time management
- FastAPI - High-performance Python web framework
- LangChain - AI workflow orchestration and management
- Ollama - Local LLM model execution and management
- Python 3.8+ - Core application logic and AI integrations
- Next.js 15 - React framework with App Router and RSC
- TypeScript - Type-safe development experience
- Tailwind CSS - Modern utility-first CSS framework
- React 19 - Component-based UI architecture
- Privacy-First: All data processing happens locally
- Offline-Capable: Works without internet after initial setup
- Open Source: Transparent, community-driven development
- Performant: Optimized for speed and resource efficiency
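
Because the Next.js frontend (http://localhost:3000) and the FastAPI backend (http://localhost:8000) run as separate local processes, the backend has to allow cross-origin requests from the frontend. The snippet below is an illustrative sketch of such an entry point, not the project's actual backend/main.py, and the /health route is purely hypothetical:

```python
# Illustrative sketch only; the real backend/main.py may be organized differently.
from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware

app = FastAPI(title="StudyPal-AI API")

# Allow the Next.js dev server to call the API from the browser.
app.add_middleware(
    CORSMiddleware,
    allow_origins=["http://localhost:3000"],
    allow_methods=["*"],
    allow_headers=["*"],
)

# Routers from backend/routes/ (summarize, explain) would be registered here
# with app.include_router(...).


@app.get("/health")
def health() -> dict:
    """Hypothetical liveness endpoint, handy for smoke tests."""
    return {"status": "ok"}
```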
# Required software
- Python 3.8+
- Node.js 18+
- Git
- Ollama (for AI functionality)

# Clone the repository
git clone https://github.com/MananVyas01/StudyPal-AI.git
cd StudyPal-AI

cd backend
pip install -r requirements.txt
python main.py

Backend will be available at http://localhost:8000
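
Once the backend is running you can sanity-check it from Python. FastAPI serves interactive docs at /docs and the OpenAPI schema at /openapi.json by default, assuming the project keeps those defaults enabled:

```python
# Quick sanity check that the API is reachable (requires the requests package).
import requests

resp = requests.get("http://localhost:8000/openapi.json", timeout=5)
resp.raise_for_status()
print("Backend is up; available paths:", sorted(resp.json().get("paths", {})))
```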
cd frontend
npm install
npm run dev

Frontend will be available at http://localhost:3000
# Install Ollama (visit https://ollama.ai for platform-specific instructions)
# Then pull a model:
ollama pull llama3.2
ollama serve
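
With the backend, frontend, and Ollama all running, you can smoke-test the stack from Python. Ollama's local REST API listens on port 11434 by default; the /explain request below is an assumption for illustration, since the backend's exact routes and payloads aren't documented here (check backend/routes/explain.py):

```python
# Smoke-test the local stack (requires the requests package).
import requests

# 1. Confirm Ollama is running and a model has been pulled.
#    GET /api/tags is Ollama's standard "list local models" endpoint.
tags = requests.get("http://localhost:11434/api/tags", timeout=5).json()
print("Local models:", [m["name"] for m in tags.get("models", [])])

# 2. Ask the backend for an explanation.
#    NOTE: the route and payload below are assumptions for illustration only.
resp = requests.post(
    "http://localhost:8000/explain",
    json={"topic": "binary search"},
    timeout=120,
)
print(resp.json())
```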
StudyPal-AI/
├── 🔧 backend/                  # FastAPI Application
│   ├── main.py                  # Application entry point
│   ├── ollama_chain.py          # AI model integration
│   ├── routes/                  # API endpoints
│   │   ├── summarize.py         # PDF summarization
│   │   └── explain.py           # Topic explanation
│   ├── utils/                   # Utility modules
│   │   └── pdf_parser.py        # PDF processing
│   └── requirements.txt         # Python dependencies
├── 🎨 frontend/                 # Next.js Application
│   ├── src/
│   │   ├── app/                 # App Router pages
│   │   │   ├── page.tsx         # Main application page
│   │   │   ├── layout.tsx       # Root layout
│   │   │   └── globals.css      # Global styles
│   │   └── components/          # React components
│   │       └── PDFUploader.tsx  # PDF upload component
│   ├── package.json             # Node.js dependencies
│   └── tailwind.config.ts       # Tailwind configuration
├── 📄 docs/                     # Documentation
├── 📋 LICENSE                   # MIT License
└── 📖 README.md                 # This file
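
ollama_chain.py is where the local model is wired into the app. The sketch below shows one common way to combine LangChain and Ollama (the community Ollama integration plus LCEL piping); it is illustrative only and may not match the project's actual chain:

```python
# Illustrative sketch of a LangChain + Ollama explanation chain; the project's
# backend/ollama_chain.py may use different prompts, models, or abstractions.
from langchain_community.llms import Ollama
from langchain_core.prompts import PromptTemplate

# Local model served by `ollama serve`; the name must match the model you pulled.
llm = Ollama(model="llama3.2")

prompt = PromptTemplate.from_template(
    "Explain the following topic to a student in clear, simple terms:\n\n{topic}"
)

# LCEL: pipe the prompt into the model to get a runnable chain.
explain_chain = prompt | llm

if __name__ == "__main__":
    print(explain_chain.invoke({"topic": "binary search"}))
```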
cd backend
# Install dependencies in development mode
pip install -r requirements.txt
# Run with auto-reload for development
uvicorn main:app --reload --host 0.0.0.0 --port 8000
# Run tests
pytest tests/
# Format code
black . && isort .
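
The pytest tests/ command above assumes a test suite under backend/tests/. As an illustrative example of what such a test could look like using FastAPI's TestClient (the /health route here is hypothetical; point it at a route that actually exists):

```python
# Hypothetical example test (backend/tests/test_health.py); the real suite and
# endpoints may differ -- adjust the route to one the backend actually exposes.
from fastapi.testclient import TestClient

from main import app  # the FastAPI instance defined in backend/main.py


def test_health_endpoint_returns_ok():
    client = TestClient(app)
    response = client.get("/health")
    assert response.status_code == 200
```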
cd frontend

# Install dependencies
npm install
# Development server with hot reload
npm run dev
# Type checking
npm run type-check
# Build for production
npm run build
npm run start
# Linting and formatting
npm run lint
npm run lint:fix

We welcome contributions from developers of all skill levels! Here's how you can help:
- Use the issue tracker
- Provide detailed reproduction steps
- Include system information and error logs
- Check existing discussions
- Describe the feature's value and use cases
- Consider implementation complexity
# 1. Fork and clone the repository
git clone https://github.com/MananVyas01/StudyPal-AI.git
# 2. Create a feature branch
git checkout -b feature/amazing-new-feature
# 3. Make your changes and test thoroughly
npm run test # Frontend tests
pytest # Backend tests
# 4. Commit with clear, descriptive messages
git commit -m "feat: add amazing new feature with tests"
# 5. Push and create a Pull Request
git push origin feature/amazing-new-feature

- Follow existing code style and conventions
- Write comprehensive tests for new features
- Update documentation for any API changes
- Ensure all tests pass before submitting PRs
# Using Docker Compose (recommended)
docker-compose up --build
# Or manually
# Terminal 1: Backend
cd backend && python main.py
# Terminal 2: Frontend
cd frontend && npm run dev

# Frontend build
cd frontend
npm run build
npm run start
# Backend production
cd backend
pip install -r requirements.txt
uvicorn main:app --host 0.0.0.0 --port 8000

| Component | Status | Completion |
|---|---|---|
| 🧠 Topic Explainer | ✅ Beta | 85% |
| 📄 PDF Summarizer | 🚧 Development | 60% |
| 📝 Smart Notes | 📋 Planned | 0% |
| 🧩 Interactive Quizzes | 📋 Planned | 0% |
| 📚 Study Plans | 📋 Planned | 0% |
| 🎯 Analytics | 📋 Planned | 0% |
# Backend tests
cd backend
pytest tests/ -v --cov=.
# Frontend tests
cd frontend
npm run test
npm run test:e2e

This project is licensed under the MIT License - see the LICENSE file for complete details.
- Ollama Team - For making local LLM deployment accessible
- LangChain - For powerful AI workflow orchestration
- Vercel Team - For Next.js and deployment platform
- FastAPI - For the excellent Python web framework
- Tailwind Labs - For the beautiful utility-first CSS
If you have any questions or need help getting started, please:
- Check the documentation
- Open an issue
- Join our community discussions
Built with ❤️ by MananVyas01 for learners everywhere