This repository contains my coursework for the Advanced Deep Learning course at the University of Trieste during the 2024–2025 academic year. The work spans theoretical foundations, empirical studies, and research in deep learning.
The coursework includes two comprehensive assignments exploring fundamental concepts in machine learning and deep learning:
- Challenge 1: Kernel Methods & Deep Learning Pipeline
  - Focus: Hybrid unsupervised-supervised learning pipeline for FashionMNIST
  - Techniques: Kernel PCA, clustering algorithms, SVM, FCN, CNN
  - Key Insight: Comparison of dimensionality reduction methods and their impact on classification performance
- Challenge 2: Neural Network Function Learnability
  - Focus: Empirical study of neural network learning dynamics
  - Techniques: Teacher-student framework, hierarchical vs non-hierarchical functions
  - Key Insight: How model capacity and function structure affect learning outcomes
The final project is a research project investigating advanced deep learning architectures:
- Group Equivariant CNNs & Low Coherence MLPs
  - G-CNNs: Implementation of group-equivariant convolutional networks exploiting data symmetries (see the sketch after this list)
  - Low Coherence MLPs: Application of frame theory principles to neural network weight matrices
  - Research Questions: Does group equivariance improve performance on symmetric datasets? Can low-coherence frames enhance MLP training?
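
To give a rough idea of the G-CNN ingredient, the snippet below sketches a C4 (90° rotation) lifting convolution in plain PyTorch. It is a minimal sketch, not the project's actual implementation; the module name, shapes, and initialization are assumptions.

```python
# Minimal sketch of a C4 lifting convolution: the same filter is applied in all
# four 90-degree rotations, so the output carries an explicit group dimension.
import torch
import torch.nn as nn
import torch.nn.functional as F

class C4LiftingConv(nn.Module):
    def __init__(self, in_channels, out_channels, kernel_size):
        super().__init__()
        self.weight = nn.Parameter(
            torch.randn(out_channels, in_channels, kernel_size, kernel_size) * 0.01
        )

    def forward(self, x):
        # Rotate the filter by k * 90 degrees and convolve; stacking the four
        # responses along a "group" axis is what makes the layer C4-equivariant.
        responses = [
            F.conv2d(x, torch.rot90(self.weight, k, dims=(2, 3)), padding="same")
            for k in range(4)
        ]
        return torch.stack(responses, dim=2)  # (batch, out_channels, 4, H, W)

x = torch.randn(8, 1, 28, 28)               # e.g. an MNIST-sized batch
y = C4LiftingConv(1, 16, kernel_size=3)(x)  # -> (8, 16, 4, 28, 28)
```

Because rotating the input permutes the group axis rather than changing the features arbitrarily, later layers can remain equivariant to input rotations.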
Challenge 1 highlights (a minimal pipeline sketch follows this list):
- Comprehensive comparison of linear PCA vs kernel PCA (Gaussian, polynomial)
- Systematic evaluation of clustering algorithms for pseudo-label generation
- Multi-architecture classification comparison (SVM, FCN, CNN)
- Performance analysis across different supervision levels
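
The hybrid unsupervised-supervised pipeline can be sketched with scikit-learn as below; the hyperparameters (number of components, kernel width, number of clusters) are illustrative assumptions, not the values used in the assignment.

```python
# Minimal sketch: kernel PCA -> clustering for pseudo-labels -> SVM classifier.
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.cluster import KMeans
from sklearn.svm import SVC

def hybrid_pipeline(X_train, X_test, n_components=50):
    # 1) Non-linear dimensionality reduction with a Gaussian (RBF) kernel.
    kpca = KernelPCA(n_components=n_components, kernel="rbf", gamma=1e-3)
    Z_train = kpca.fit_transform(X_train)
    Z_test = kpca.transform(X_test)

    # 2) Cluster the embedded training data; cluster ids act as pseudo-labels.
    pseudo_labels = KMeans(n_clusters=10, n_init=10, random_state=0).fit_predict(Z_train)

    # 3) Train a supervised classifier on the pseudo-labels.
    #    (Comparing to true labels requires matching cluster ids to classes.)
    clf = SVC(kernel="rbf").fit(Z_train, pseudo_labels)
    return clf.predict(Z_test)
```

In the supervised comparisons, the same reduced features can instead feed an SVM, FCN, or CNN trained on the true labels, which is how the different supervision levels are contrasted.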
Challenge 2 highlights (a minimal teacher-student sketch follows this list):
- Teacher-student framework with under/equal/over-parameterized models
- Hierarchical vs non-hierarchical function learning comparison
- Residual network analysis with systematic variable sensitivity studies
- Statistical robustness through multiple random seed validation
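
The teacher-student setup boils down to a frozen teacher network generating labels that a student of varying capacity tries to fit. Below is a minimal sketch in plain PyTorch; the widths, depths, input distribution, and training loop are assumptions, not the challenge's exact settings.

```python
# Minimal teacher-student sketch: the student regresses the teacher's outputs.
import torch
import torch.nn as nn

def make_mlp(d_in, width, depth):
    layers, d = [], d_in
    for _ in range(depth):
        layers += [nn.Linear(d, width), nn.ReLU()]
        d = width
    layers.append(nn.Linear(d, 1))
    return nn.Sequential(*layers)

torch.manual_seed(0)                              # fixed seed for reproducibility
teacher = make_mlp(d_in=20, width=16, depth=2)    # frozen target function
student = make_mlp(d_in=20, width=64, depth=2)    # over-parameterized student

X = torch.randn(4096, 20)
with torch.no_grad():
    y = teacher(X)                                # labels generated by the teacher

opt = torch.optim.Adam(student.parameters(), lr=1e-3)
for step in range(1000):
    opt.zero_grad()
    loss = nn.functional.mse_loss(student(X), y)
    loss.backward()
    opt.step()
```

Choosing the student width below, equal to, or above the teacher's width gives the under/equal/over-parameterized regimes, and repeating runs over several seeds supports the statistical robustness mentioned above.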
Final project highlights (a coherence-measurement sketch follows this list):
- G-CNN Implementation: Group-equivariant convolutions for cyclic and dihedral groups
- Frame Theory Application: Low-coherence frame optimization for MLP weights
- Comprehensive Evaluation: Performance analysis on MNIST, Fashion-MNIST, CIFAR-10
- Visualization Tools: Kernel weight analysis and group transformation visualization
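
For the low-coherence MLP component, the mutual coherence of a weight matrix (treating its rows as frame vectors) can be measured as below. This is a minimal sketch, assuming plain PyTorch; the regularization weight mentioned in the comment is a hypothetical example, not the project's actual objective.

```python
# Minimal sketch: mutual coherence of a layer's weight rows.
import torch

def mutual_coherence(W, eps=1e-8):
    """Largest absolute cosine similarity between distinct rows of W."""
    Wn = W / (W.norm(dim=1, keepdim=True) + eps)
    G = Wn @ Wn.t()                                    # Gram matrix of unit rows
    off_diag = G - torch.eye(G.shape[0], device=G.device)
    return off_diag.abs().max()

coh = mutual_coherence(torch.nn.Linear(256, 128).weight)
# During training one could add e.g. `loss = task_loss + 0.1 * coh` to push the
# rows toward a low-coherence (near-equiangular) frame.
```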
Technologies and tools used across the coursework (a configuration sketch follows this list):
- Frameworks: PyTorch, PyTorch Lightning, TensorBoard
- Data Processing: Custom data loaders, dimensionality reduction techniques
- Analysis: Statistical evaluation, visualization, experiment tracking
- Configuration: YAML-based parameter management
- Reproducibility: Fixed random seeds, comprehensive logging
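
A minimal sketch of how YAML-based configuration and fixed seeds typically fit together with PyTorch Lightning; the keys below are hypothetical examples, not the repository's actual config schema (in practice the YAML would live in a file per experiment).

```python
# Minimal sketch: load hyperparameters from YAML and fix all random seeds.
import yaml
import pytorch_lightning as pl

cfg = yaml.safe_load("""
seed: 42
max_epochs: 10
""")

pl.seed_everything(cfg["seed"], workers=True)    # fixed seeds for reproducibility

trainer = pl.Trainer(max_epochs=cfg["max_epochs"])
# trainer.fit(model, datamodule)  # model and datamodule come from the project code
```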
Each subdirectory contains detailed documentation and setup instructions:
- Challenges: See the individual Report_Summary.md files for detailed analysis
- Final Project: Comprehensive README with setup, usage, and results
- Challenge Reports: PDF reports with detailed analysis for each challenge
- Project Slides: Presentation materials available in the final project directory
- Code Documentation: Well-documented implementations with usage examples
- Experimental Results: Plots, logs, and analysis scripts for reproducibility