RobertFrenette/Code_Analysis_Agent

Code Analysis Agent (Vibe coded in Cursor)

This repository provides a full-stack solution for generating narrated code analysis using Python, FastAPI, Flask, and Ollama models.

The sample code files come from the CSCI_E-89_Deep_Learning_Fall_25 repo, a Local Multimodal Hybrid RAG Agent.

Demo Video

Code Analysis Agent Demo

Project Structure

  • app/
    • backend/: FastAPI service for code analysis and narration generation (Ollama integration)
    • frontend/: Flask app for file browsing, code display, and narration playback
    • README.md: App-level documentation
  • data/
    • src/: Source code files that can be analyzed (browsed via frontend)
      • agent/: Agent logic and processors
      • db/: Database creation and seeding scripts
      • vectorstore/: Vector store utilities
  • shutdown.sh / startup.sh: Scripts to start and stop the application
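As a rough sketch, the frontend's browsing of data/src/ amounts to enumerating the Python files under that directory (the function name and return shape here are illustrative, not the repo's actual code):

```python
from pathlib import Path

def list_python_files(root: str) -> list[str]:
    """Return sorted, relative paths of .py files under root for display."""
    base = Path(root)
    return sorted(str(p.relative_to(base)) for p in base.rglob("*.py"))
```

Called as `list_python_files("data/src")`, this would surface the agent/, db/, and vectorstore/ modules to the file browser.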

Features

  • Browse and select Python files from data/src/ via the frontend
  • Backend analyzes code and generates narration using Ollama models
  • REST API endpoints for code analysis and narration
  • Frontend displays code with syntax highlighting, summary, and narration playback
  • Audio narration generation using gTTS
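For reference, a request to Ollama's /api/generate endpoint, which the backend could use for narration generation, might be assembled like this (the model name, prompt, and URL are placeholders; the repo's actual integration may differ):

```python
import json
import urllib.request

OLLAMA_API_URL = "http://localhost:11434"  # Ollama's default port; configurable via .env

def build_generate_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a POST request for Ollama's /api/generate endpoint."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode("utf-8")
    return urllib.request.Request(
        f"{OLLAMA_API_URL}/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# req = build_generate_request("llama3.2", "Summarize this Python file: ...")
# with urllib.request.urlopen(req) as resp:  # requires a running Ollama server
#     narration = json.loads(resp.read())["response"]
```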

Screenshots

Screenshot 1 Screenshot 2

Getting Started

  1. Install Python 3.12+ and Ollama
  2. Set up Ollama (see Ollama Setup)
  3. Run startup.sh from the project root to:
    • Automatically set up .env files from sample.env.txt files (if they don't already exist)
    • Install dependencies for both backend and frontend
    • Launch both backend and frontend services
  4. Open the frontend at http://localhost:5173 and select a Python file from the file browser
  5. See individual README.md files in app/backend/ and app/frontend/ for more details

Environment Configuration

The startup.sh script automatically creates .env files from sample.env.txt templates:

  • app/backend/.env - Backend configuration (OLLAMA_API_URL, FRONTEND_URL)
  • app/frontend/.env - Frontend configuration (BACKEND_URL, PORT, DEBUG)

If .env files already exist, they won't be overwritten. You can customize these files as needed.
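For example, app/frontend/.env might contain something like the following (these values are illustrative defaults, not necessarily what sample.env.txt ships with):

```
# app/frontend/.env (illustrative values)
BACKEND_URL=http://localhost:8000
PORT=5173
DEBUG=True
```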

Setup Scripts

  • startup.sh:
    • Sets up .env files from sample.env.txt templates (if they don't exist)
    • Installs dependencies for both backend and frontend
    • Launches backend and frontend services
  • shutdown.sh: Stops all running services and cleans up temporary files
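The .env bootstrapping step in startup.sh can be mimicked in a few lines of Python (this is a sketch of the described behavior, not the script itself):

```python
import shutil
from pathlib import Path

def ensure_env(app_dir: str) -> bool:
    """Create app_dir/.env from sample.env.txt if it doesn't exist yet.

    Returns True if a new .env was created, False if one already existed
    (existing files are never overwritten, matching startup.sh's behavior).
    """
    env_file = Path(app_dir) / ".env"
    sample = Path(app_dir) / "sample.env.txt"
    if env_file.exists():
        return False
    shutil.copyfile(sample, env_file)
    return True
```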

Documentation

About

An AI Code Analysis Agent built with FastAPI and Flask that uses a local LLM running on Ollama.
