This repository provides a full-stack solution for generating narrated code analysis using Python, FastAPI, Flask, and Ollama models.
The sample code files come from the CSCI_E-89_Deep_Learning_Fall_25 repo, which is a Local Multimodal Hybrid RAG Agent.
- `app/`
  - `backend/`: FastAPI service for code analysis and narration generation (Ollama integration)
  - `frontend/`: Flask app for file browsing, code display, and narration playback
  - `README.md`: App-level documentation
- `data/`
  - `src/`: Source code files that can be analyzed (browsed via the frontend)
  - `agent/`: Agent logic and processors
  - `db/`: Database creation and seeding scripts
  - `vectorstore/`: Vector store utilities
- `shutdown.sh` / `startup.sh`: Scripts to start and stop the application
- Browse and select Python files from `data/src/` via the frontend
- Backend analyzes code and generates narration using Ollama models
- REST API endpoints for code analysis and narration
- Frontend displays code with syntax highlighting, summary, and narration playback
- Audio narration generation using gTTS (a minimal sketch follows this list)
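
For reference, the gTTS call itself is only a few lines. The snippet below is a minimal sketch: the narration text and output filename are made up for illustration, and the backend's actual narration pipeline may wire this differently.

```python
# Minimal gTTS sketch: turn a narration string into an MP3 file.
# The text and filename here are illustrative, not taken from the backend.
from gtts import gTTS

narration = "This module defines the routes used for code analysis."
tts = gTTS(text=narration, lang="en")  # requires network access to Google TTS
tts.save("narration.mp3")              # MP3 the frontend could play back
```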
- Install Python 3.12+ and Ollama
- Set up Ollama (see Ollama Setup)
- Run `startup.sh` from the project root to:
  - Automatically set up `.env` files from `sample.env.txt` templates (if they don't already exist)
  - Install dependencies for both backend and frontend
  - Launch both backend and frontend services
- Open the frontend at http://localhost:5173 and select a Python file from the file browser
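
If the page does not load, a quick smoke test like the one below can confirm the services are reachable. The frontend URL comes from this README; the backend check reads `BACKEND_URL` from the environment rather than assuming a port, and the `requests` dependency is an assumption of this sketch.

```python
# Smoke test: verify the frontend (and optionally the backend) respond.
# Assumes the `requests` package is installed; BACKEND_URL is optional.
import os
import requests

def check(url: str) -> None:
    try:
        resp = requests.get(url, timeout=5)
        print(f"{url} -> HTTP {resp.status_code}")
    except requests.exceptions.ConnectionError:
        print(f"{url} -> not reachable (did startup.sh finish?)")

if __name__ == "__main__":
    check("http://localhost:5173")      # frontend URL from this README
    backend = os.getenv("BACKEND_URL")  # set in app/frontend/.env
    if backend:
        check(backend)
```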
- See the individual README.md files in `app/backend/` and `app/frontend/` for more details
The `startup.sh` script automatically creates `.env` files from `sample.env.txt` templates:
- `app/backend/.env` - Backend configuration (`OLLAMA_API_URL`, `FRONTEND_URL`)
- `app/frontend/.env` - Frontend configuration (`BACKEND_URL`, `PORT`, `DEBUG`)
If `.env` files already exist, they won't be overwritten. You can customize these files as needed.
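
For orientation, here is one way a Python service can pick these values up at runtime. The variable names come from the list above; the use of `python-dotenv` and the fallback defaults are assumptions of this sketch, not necessarily how the backend loads its config.

```python
# Sketch: load backend settings from app/backend/.env with python-dotenv.
# OLLAMA_API_URL and FRONTEND_URL are the names documented above; the
# defaults below are common local values, assumed for illustration.
import os
from dotenv import load_dotenv

load_dotenv("app/backend/.env")  # silently does nothing if the file is absent

OLLAMA_API_URL = os.getenv("OLLAMA_API_URL", "http://localhost:11434")
FRONTEND_URL = os.getenv("FRONTEND_URL", "http://localhost:5173")

print(f"Ollama endpoint: {OLLAMA_API_URL}; allowed frontend: {FRONTEND_URL}")
```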
- `startup.sh`:
  - Sets up `.env` files from `sample.env.txt` templates (if they don't exist)
  - Installs dependencies for both backend and frontend
  - Launches backend and frontend services
- `shutdown.sh`: Stops all running services and cleans up temporary files
- Ollama Setup: How to install and configure Ollama (see this file in the project root)
- Backend README: Backend API details
- Frontend README: Frontend usage details

