AI Interviewer

A proof of concept showcasing: data science interview questions, RAG context with Chroma, hybrid grading (LLM + rule-based keypoints), and a chat-style UI.

Features

  • RAG over seeded docs (Chroma + MiniLM embeddings)
  • Hybrid grading: the LLM returns scores plus a summary; rule-based coverage compares answer embeddings against keypoints
  • Chat-style UI (Streamlit) with running scores and citations
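The rule-based half of hybrid grading can be sketched as keypoint matching in embedding space. This is a hypothetical illustration, not the repo's code: it assumes the answer and each keypoint have already been embedded (e.g. with MiniLM) into plain vectors, and the function names and threshold are made up for the example.

```python
from math import sqrt

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sqrt(sum(x * x for x in a))
    nb = sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def keypoint_coverage(answer_vec, keypoint_vecs, threshold=0.6):
    """Fraction of keypoints whose embedding is close enough to the answer."""
    if not keypoint_vecs:
        return 0.0
    hits = [kp for kp in keypoint_vecs if cosine(answer_vec, kp) >= threshold]
    return len(hits) / len(keypoint_vecs)
```

A coverage score like this can then be blended with the LLM's score to produce the final grade.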

Stack / tools

  • FastAPI: backend API and routing
  • Streamlit: chat-style frontend
  • ChromaDB: persistent vector store for RAG
  • sentence-transformers (MiniLM): embeddings for retrieval and coverage
  • OpenAI API: grading and keypoint extraction
  • Uvicorn: ASGI server with reload in dev
  • Pydantic / pydantic-settings: config and schemas

Quick start

Prereqs: Docker for the quick start, or Python 3.10+ and pip to run locally.

docker compose up --build

Then open the Streamlit UI at http://localhost:8501 and the backend API at http://localhost:8000.

Models

  • Embeddings: sentence-transformers/all-MiniLM-L6-v2 (CPU-friendly)
  • LLM: OpenAI (gpt-4o-mini), used when OPENAI_API_KEY is set
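Since the LLM is only used when an API key is present, grading presumably falls back to the rule-based path otherwise. A hypothetical sketch of that switch (function name and return values are illustrative, not from the repo):

```python
import os

def pick_grading_mode():
    """Use LLM + rule-based grading when a key is configured, else rules only."""
    if os.environ.get("OPENAI_API_KEY"):
        return "llm+rules"
    return "rules-only"
```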

Environment

Create a .env file in the repo root (optional):

OPENAI_API_KEY=sk-...
OPENAI_MODEL=gpt-4o-mini

# App
APP_ENV=dev
LOG_LEVEL=INFO
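The repo's config layer uses pydantic-settings; as a rough stdlib-only sketch of the same idea, with field names assumed from the .env keys above:

```python
import os
from dataclasses import dataclass

@dataclass
class Settings:
    """Mirrors the .env keys; every field has a sensible default."""
    openai_api_key: str = ""
    openai_model: str = "gpt-4o-mini"
    app_env: str = "dev"
    log_level: str = "INFO"

    @classmethod
    def from_env(cls):
        return cls(
            openai_api_key=os.environ.get("OPENAI_API_KEY", ""),
            openai_model=os.environ.get("OPENAI_MODEL", "gpt-4o-mini"),
            app_env=os.environ.get("APP_ENV", "dev"),
            log_level=os.environ.get("LOG_LEVEL", "INFO"),
        )
```

With pydantic-settings, the equivalent class also reads the .env file directly and validates types.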

Run locally (without Docker)

python3 -m venv .venv && source .venv/bin/activate
pip install -r backend/requirements.txt
uvicorn app.main:app --reload --host 0.0.0.0 --port 8000 --app-dir backend

In another terminal:

source .venv/bin/activate
pip install -r frontend/requirements.txt
streamlit run frontend/streamlit_app.py --server.port 8501

Debug RAG docs:

curl "http://localhost:8000/interview/debug/rag/ds_theory?limit=5"

Roadmap

  • Phase 2: speech (Whisper), Langfuse tracing, LoRA evaluator, deployment to Fly.io.
