🚀 Team ClosedAI

This project is part of the DevOps course at TUM: a collaborative team effort integrating server, client, and AI components with DevOps best practices.

📋 Project Management Application

Our application helps teams organize projects with an intuitive kanban interface ✨ and AI-powered chat assistant 🤖. The system can analyze project context to suggest new tasks or answer questions, making project management simpler for everyone from experienced managers to students.

Use Scenarios 🎭

📚 For detailed use cases and requirements, see our Problem Statement

  1. Team Setup 👥: A student team creates a project and builds their task backlog with our kanban board.

  2. New Team Member 🆕: Bob asks the AI about the project and receives personalized task recommendations matching his React skills.

  3. Solo Developer 👨‍💻: Charlie uploads project requirements and the AI generates a complete task backlog in one click.

Use Case Diagram

System Architecture 🏗️

📚 For detailed architectural information, see our System Overview

Microservice architecture for seamless project management:

Component Diagram

Components

  • Client 💻: React-based interface for project management and chat.
  • Server 🖥️: Java Spring Boot microservices:
    • User Service: User management
    • Project Service: Project organization
    • Task Service: Task management
  • GenAI Service 🧠: Python-based intelligent chat assistant using large language models.

The system uses PostgreSQL with pgvector and pgai for storage and semantic search.
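Conceptually, the semantic search that pgvector provides ranks rows by the distance between embedding vectors. Below is a minimal pure-Python sketch of that ranking, assuming toy 3-dimensional vectors; in the real system, pgvector evaluates the distance in-database (e.g. via its cosine-distance operator `<=>`) and the embeddings come from the LLM, not hand-written numbers:

```python
from math import sqrt

def cosine_distance(a: list[float], b: list[float]) -> float:
    """Cosine distance (1 - cosine similarity), the metric behind pgvector's <=> operator."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b))
    return 1.0 - dot / norm

def nearest_tasks(query_vec: list[float], task_embeddings, k: int = 2) -> list[str]:
    """Return the ids of the k tasks whose embeddings are closest to the query."""
    ranked = sorted(task_embeddings, key=lambda t: cosine_distance(query_vec, t[1]))
    return [task_id for task_id, _ in ranked[:k]]

# Toy embeddings standing in for real model output (ids are illustrative only).
tasks = [
    ("setup-ci", [0.9, 0.1, 0.0]),
    ("design-kanban-ui", [0.1, 0.9, 0.2]),
    ("write-docs", [0.2, 0.2, 0.9]),
]
print(nearest_tasks([0.85, 0.15, 0.05], tasks, k=1))  # → ['setup-ci']
```

The same idea, done at scale inside PostgreSQL, is what lets the chat assistant pull in the most relevant projects and tasks before answering.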

Object Model 🧩

Object Model

GenAI Implementation 🔬

📚 For more details, see GenAI Documentation

GenAI Workflow

Our AI classifies user input to generate tasks or answer questions with contextual relevance.

Our AI service uses:

  • LangChain for orchestrating AI workflows
  • Ollama for running large language models locally
  • pgvector for semantic search with vector embeddings
  • pgai for managing embeddings and vectorization of project data
  • RAG (Retrieval-Augmented Generation) for context-aware responses
  • Intent Classification to distinguish between task generation and question answering

The system automatically embeds project and task data, enabling intelligent recommendations and contextual answers.
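The classify-then-respond flow can be pictured as a small router in front of two handlers. The sketch below is illustrative only: the real service delegates intent classification to an LLM via LangChain, whereas here a keyword rule stands in for it, and `build_prompt` shows the RAG idea of prepending retrieved project snippets to the user message:

```python
def classify_intent(message: str) -> str:
    """Toy stand-in for the LLM-based intent classifier: decide whether
    the user wants tasks generated or a question answered."""
    task_cues = ("generate task", "create task", "backlog", "break down")
    lowered = message.lower()
    return "generate_tasks" if any(cue in lowered for cue in task_cues) else "answer_question"

def build_prompt(message: str, retrieved_context: list[str]) -> str:
    """RAG step: prepend retrieved project/task snippets to the user message."""
    context = "\n".join(f"- {snippet}" for snippet in retrieved_context)
    return f"Project context:\n{context}\n\nUser: {message}"

intent = classify_intent("Please generate tasks for the login feature backlog")
prompt = build_prompt("Who is working on the client?", ["Task: build kanban UI (client)"])
print(intent)                  # → generate_tasks
print(prompt.splitlines()[0])  # → Project context:
```

In the real pipeline, the retrieved snippets come from the pgvector similarity search over the automatically embedded project and task data.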

Setup & Development 🛠️

Local Development

# 1. Clone repository
git clone https://github.com/AET-DevOps25/team-closed-ai.git && cd team-closed-ai

# 2. Setup environment
cp genai/.env.example genai/.env
# Edit the .env file to add your API keys

# (Optional: Set up Discord Alerting Webhook)
# Replace DISCORD_WEBHOOK_URL with the actual URL in ./monitoring/grafana-contact-points.yml

# 3. Launch all services
docker-compose up

The application is then available at http://localhost

This setup includes:

  • Client application
  • Server APIs at various endpoints (users, projects, tasks)
  • PostgreSQL database
  • Ollama LLM server
  • GenAI service with vector database support
  • Monitoring and Observability with Grafana and Prometheus

API Documentation 📖

Interactive Swagger UI documentation is available for all services:

💡 All services must be running via docker compose up to access the documentation.

Tests ✅

# Run tests for each component (run each command from the repository root)
cd server && ./gradlew test
cd client && npm test
cd genai && python -m pytest

# Run tests with coverage verification
cd server && ./gradlew test jacocoTestCoverageVerification
cd client && npm run test:coverage
cd genai && python -m pytest  # coverage options, if any, come from the pytest configuration
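As a concrete illustration of what `python -m pytest` discovers, a unit test module might look like the following. The `normalize_title` helper here is hypothetical, shown only to demonstrate the test layout, and is not part of the actual genai service:

```python
# test_task_titles.py — example module that `python -m pytest` would collect.

def normalize_title(raw: str) -> str:
    """Hypothetical helper: collapse whitespace and capitalize the first letter."""
    cleaned = " ".join(raw.split())
    return cleaned[:1].upper() + cleaned[1:]

def test_normalize_title():
    # pytest collects functions named test_* and reports each assert failure.
    assert normalize_title("  fix   login  bug ") == "Fix login bug"
    assert normalize_title("x") == "X"
```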

Deployment 🚒

AWS Deployment ☁️

Our AWS deployment uses Terraform and Ansible with Docker Compose:

Run the following command to set up the resources on AWS:

cd terraform && terraform apply -auto-approve

Run the following command to deploy the application (make sure to set up your environment variables first):

cd ansible && ansible-playbook -i inventory.yml playbook.yml --extra-vars "ansible_host=<your-aws-instance-ip>"

You can also run the GitHub Action for deploying to AWS.

Kubernetes Deployment 🌐

For Kubernetes deployment we use Helm charts. You can access our application deployed on Rancher here: https://closed-ai.student.k8s.aet.cit.tum.de

If you want to deploy our chart in another namespace or cluster, you have to manually create two secrets (e.g. with kubectl create secret docker-registry) so the cluster can pull the Docker images:

  • ghcr-secret: ghcr.io username and token for pulling the project's own images.
  • dockerhub-secret: Docker Hub username and password for pulling the images of the project's dependencies.

These secrets are only needed if the anonymous image pull limit is exceeded; otherwise you can remove their references from the template files under .helm/closed-ai/templates.

CI/CD Pipeline 🔄

The project uses GitHub Actions for:

  • Automated testing
  • Linting
  • Building Docker images
  • Deployment to AWS
  • Deployment to Rancher

Monitoring 📊 & Alerting 🚨

The project includes monitoring with Prometheus for metrics collection and Grafana for visualization:

  • Grafana: dashboards (default credentials: admin/admin)
    • Business Overview: Application metrics, user/project/task statistics
    • Kubernetes Overview: Cluster monitoring
  • Prometheus: Metrics scraping and storage

Dashboards are automatically provisioned from the monitoring/dashboards/ directory.

You can join ClosedAI's Discord server to get updates when alerts fire: https://discord.gg/uxbsnNjY.

📚 For detailed monitoring setup, configuration, and alerting, see our Monitoring Guide

Team Members 👨‍💻

Team Member    Role
@SimonKaran13  Server Developer
@jakobkoerber  Client Developer
@DominikRemo   GenAI Developer
