GoS_AI_Project is a deep learning system that combines image classification and object detection. Designed for smart educational and security applications, it uses neural networks to recognize and detect objects in images, with a focus on deployment efficiency and accuracy.
- ✅ Image classification using pretrained CNN models (ResNet, VGG16, MobileNet) – a minimal sketch follows this list
- 🎯 Image classification on custom datasets
- 📊 Performance metrics visualization (Accuracy, F1-Score, etc.)
- 🧠 Future-ready, with planned support for self-supervised learning and edge deployment
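As a quick illustration of the pretrained-CNN classification feature above, the sketch below loads a Keras MobileNetV2 with ImageNet weights and classifies a single image. This is a minimal, illustrative example, not the project's training code; the model choice and the `sample.jpg` path are assumptions.

```python
# Minimal sketch: classify one image with a pretrained Keras CNN (assumed: MobileNetV2).
import numpy as np
from tensorflow.keras.applications import MobileNetV2
from tensorflow.keras.applications.mobilenet_v2 import decode_predictions, preprocess_input
from tensorflow.keras.preprocessing import image

model = MobileNetV2(weights="imagenet")  # pretrained on ImageNet

# "sample.jpg" is a placeholder path, not a file shipped with this repo.
img = image.load_img("sample.jpg", target_size=(224, 224))
x = preprocess_input(np.expand_dims(image.img_to_array(img), axis=0))

preds = model.predict(x)
for _, label, score in decode_predictions(preds, top=3)[0]:
    print(f"{label}: {score:.3f}")
```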
- Angular (for a future real-time monitoring dashboard or visual results display)
- Django (REST API for serving predictions and managing models)
- TensorFlow / Keras
- PyTorch
- OpenCV – Image manipulation
- Matplotlib, Seaborn – Visualization
- Pandas, NumPy – Data manipulation
- Scikit-learn – Evaluation metrics (see the metrics sketch after this list)
- Flask / FastAPI – Lightweight deployment (a minimal serving sketch follows the setup steps below)
- Unsloth / LoRA – LLM fine-tuning
- Ollama – Local large language model inference and serving
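As a rough illustration of how the evaluation pieces above fit together (Scikit-learn for metrics, Seaborn/Matplotlib for plots), the sketch below prints accuracy and macro F1 and draws a confusion-matrix heatmap. The label arrays are placeholders, not real project results.

```python
# Minimal sketch: metrics + confusion-matrix heatmap with Scikit-learn and Seaborn.
import matplotlib.pyplot as plt
import seaborn as sns
from sklearn.metrics import accuracy_score, confusion_matrix, f1_score

# Placeholder labels; in practice these come from a trained model's predictions.
y_true = [0, 1, 1, 0, 2, 2, 1, 0]
y_pred = [0, 1, 0, 0, 2, 1, 1, 0]

print("Accuracy:", accuracy_score(y_true, y_pred))
print("F1 (macro):", f1_score(y_true, y_pred, average="macro"))

sns.heatmap(confusion_matrix(y_true, y_pred), annot=True, fmt="d", cmap="Blues")
plt.xlabel("Predicted")
plt.ylabel("True")
plt.title("Confusion matrix")
plt.show()
```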
- Node.js (v18+)
- Python (v3.10+)
- Django (v4+)
- Angular CLI (v15+)
- Clone the repository

  ```bash
  git clone https://github.com/RawCooked/GoS_AI_Project.git
  cd GoS_AI_Project
  ```

- Backend Setup

  ```bash
  cd backend
  python -m venv venv
  source venv/bin/activate  # On Windows, use `venv\Scripts\activate`
  pip install -r requirements.txt
  python manage.py migrate
  python manage.py runserver
  ```

- Frontend Setup

  ```bash
  cd ../frontend
  npm install
  ng serve
  ```

- Access the App

  Open http://localhost:4200 in your browser for the frontend and http://localhost:8000 for the backend.
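Besides the Django backend above, the tech stack lists Flask/FastAPI for lightweight deployment. A minimal FastAPI sketch for serving predictions could look like the following; the model path and class labels are placeholders, not files shipped with this repo.

```python
# Minimal sketch: a lightweight FastAPI prediction endpoint (placeholder model path and labels).
import io

import numpy as np
from fastapi import FastAPI, File, UploadFile
from PIL import Image
from tensorflow.keras.models import load_model

app = FastAPI()
model = load_model("models/classifier.h5")       # placeholder path
LABELS = ["artistic", "logical", "singing"]      # placeholder class names

@app.post("/predict")
async def predict(file: UploadFile = File(...)):
    # Read the uploaded image, resize it, and normalize pixel values to [0, 1].
    img = Image.open(io.BytesIO(await file.read())).convert("RGB").resize((224, 224))
    x = np.expand_dims(np.asarray(img, dtype="float32") / 255.0, axis=0)
    probs = model.predict(x)[0]
    return {"label": LABELS[int(np.argmax(probs))], "confidence": float(np.max(probs))}
```

With the sketch saved as `app.py`, it could be started with `uvicorn app:app --reload`; file uploads additionally require the `python-multipart` package.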
This is how this repo is organized:
```
/dataset
  /Act
    /Artistical-talent-detection
      /Datasets
      /Notebooks
    /Mathematical-logical-thinking
      /Notebooks
    /Singing-talent-detection
      /Notebooks
      /Datasets
      /Audio-Preview
  /Engage
  /Investigate
```
- 📚 Self-supervised learning for semi-labeled datasets
- ⚡ Real-time optimization (quantization, pruning) – see the quantization sketch after this list
- 🧠 Edge deployment on Raspberry Pi
- 🔋 Energy-efficient architectures
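One common route to the quantization item above is post-training quantization with TensorFlow Lite, which also pairs naturally with Raspberry Pi deployment. This is a hedged sketch assuming a saved Keras model at a placeholder path, not the project's actual optimization pipeline.

```python
# Minimal sketch: post-training dynamic-range quantization with TensorFlow Lite.
import tensorflow as tf

model = tf.keras.models.load_model("models/classifier.h5")  # placeholder path

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]  # enables weight quantization

tflite_model = converter.convert()
with open("models/classifier_quant.tflite", "wb") as f:
    f.write(tflite_model)
```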
- Inspired by Our Dear Professors (❁´◡`❁) & Personal Experiences
- Special thanks to ESPRIT School of Engineering for their continuous support and guidance