A collection of experiments covering fundamental Deep Learning concepts, from implementing a Perceptron, the forward pass, and backpropagation from scratch in NumPy, to Keras implementations of MLP/CNN image classifiers and LSTM time-series prediction.
- Perceptron Implementation (AND Gate): Implementation of the classic Perceptron learning algorithm to solve the AND logic gate, showing the weight and bias updates step by step (see the first sketch after this list).
- Core Components: Activation and Loss: Analysis and implementation of the fundamental functions crucial for training, including activation functions and loss functions (sketched below).
- The Backpropagation Algorithm (From Scratch): Core implementation of the backpropagation algorithm for a multi-layer neural network using NumPy, demonstrating how weights and biases are updated to minimize the loss over epochs (sketched below).
- Foundational Architecture from Scratch: Exploration of the mechanics of an Artificial Neural Network by manually implementing a basic Feedforward Neural Network, its forward pass, and its loss calculation using only NumPy (sketched below).
- Comparative CNN Analysis (MNIST): Investigation and comparison of different Convolutional Neural Network (CNN) architectures (a basic CNN, variants with different layer counts, and LeNet-5) for handwritten-digit classification, analyzing their training dynamics and performance metrics (sketched below).
- Model Optimization and Tuning: An experiment comparing the performance and convergence of various optimizers to identify the most efficient method for a specific model (sketched below).
- Sequential Data Prediction (LSTM): Implementation of a Long Short-Term Memory (LSTM) recurrent neural network for time-series forecasting, demonstrated through a practical example of stock-price prediction (sketched below).
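
To make the Perceptron experiment concrete, here is a minimal NumPy sketch of the classic learning rule on the AND gate. The function name and hyperparameters (learning rate, epoch count) are illustrative, not necessarily the repo's actual values.

```python
import numpy as np

def train_perceptron(X, y, lr=0.1, epochs=10):
    """Classic Perceptron rule with a step activation (illustrative sketch)."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, target in zip(X, y):
            pred = 1 if np.dot(w, xi) + b > 0 else 0
            error = target - pred
            w += lr * error * xi   # weight update
            b += lr * error       # bias update
    return w, b

# AND gate truth table
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])
w, b = train_perceptron(X, y)
for xi in X:
    print(xi, "->", int(np.dot(w, xi) + b > 0))
```

Because AND is linearly separable, the rule converges to a separating weight vector within a few epochs.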
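For the activation/loss experiment, a compact sketch of the kinds of functions involved; the exact set implemented in the repo may differ.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def relu(z):
    return np.maximum(0.0, z)

def softmax(z):
    e = np.exp(z - np.max(z))  # subtract max for numerical stability
    return e / e.sum()

def mse(y_true, y_pred):
    return np.mean((y_true - y_pred) ** 2)

def cross_entropy(y_true, y_pred, eps=1e-12):
    # y_true is a one-hot vector; eps avoids log(0)
    return -np.sum(y_true * np.log(y_pred + eps))
```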
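A from-scratch backpropagation sketch for a small two-layer network, assuming sigmoid activations, an MSE loss, and a toy XOR dataset; the repo's network shape, data, and hyperparameters may differ.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)

# Toy XOR data: 4 samples, 2 features
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# One hidden layer (4 units), one output unit
W1 = rng.normal(0, 1, (2, 4)); b1 = np.zeros((1, 4))
W2 = rng.normal(0, 1, (4, 1)); b2 = np.zeros((1, 1))
lr = 1.0

for epoch in range(5000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    loss = np.mean((y - out) ** 2)

    # Backward pass (chain rule through MSE and sigmoid)
    d_out = (out - y) * out * (1 - out)
    d_W2 = h.T @ d_out
    d_b2 = d_out.sum(axis=0, keepdims=True)
    d_h = (d_out @ W2.T) * h * (1 - h)
    d_W1 = X.T @ d_h
    d_b1 = d_h.sum(axis=0, keepdims=True)

    # Gradient-descent updates
    W2 -= lr * d_W2; b2 -= lr * d_b2
    W1 -= lr * d_W1; b1 -= lr * d_b1

    if epoch % 1000 == 0:
        print(f"epoch {epoch}: loss = {loss:.4f}")
```

Printing the loss every 1000 epochs makes the "minimize loss over epochs" behavior visible directly.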
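For the foundational-architecture experiment, a bare-bones forward pass and loss calculation for a one-hidden-layer feedforward network; the layer sizes and the ReLU/MSE choices here are assumptions for illustration.

```python
import numpy as np

def relu(z):
    return np.maximum(0.0, z)

def forward(x, params):
    """One hidden layer: input -> ReLU -> linear output."""
    h = relu(x @ params["W1"] + params["b1"])
    return h @ params["W2"] + params["b2"]

rng = np.random.default_rng(42)
params = {
    "W1": rng.normal(0, 0.1, (3, 5)), "b1": np.zeros(5),
    "W2": rng.normal(0, 0.1, (5, 1)), "b2": np.zeros(1),
}
x = rng.normal(size=(4, 3))        # batch of 4 samples, 3 features
y_true = rng.normal(size=(4, 1))
y_pred = forward(x, params)
loss = np.mean((y_true - y_pred) ** 2)  # MSE loss
print("loss:", loss)
```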
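One way to frame the CNN comparison, assuming `tensorflow.keras`: a basic CNN against a LeNet-5-style model on MNIST, trained for a single epoch here just to keep the sketch short. The actual experiment's architectures, epoch counts, and metrics may differ.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train = x_train[..., None] / 255.0
x_test = x_test[..., None] / 255.0

def basic_cnn():
    return models.Sequential([
        layers.Conv2D(32, 3, activation="relu", input_shape=(28, 28, 1)),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(10, activation="softmax"),
    ])

def lenet5():
    # LeNet-5-style: two conv/pool stages followed by dense layers
    return models.Sequential([
        layers.Conv2D(6, 5, activation="tanh", padding="same",
                      input_shape=(28, 28, 1)),
        layers.AveragePooling2D(),
        layers.Conv2D(16, 5, activation="tanh"),
        layers.AveragePooling2D(),
        layers.Flatten(),
        layers.Dense(120, activation="tanh"),
        layers.Dense(84, activation="tanh"),
        layers.Dense(10, activation="softmax"),
    ])

for name, build in [("basic", basic_cnn), ("lenet5", lenet5)]:
    model = build()
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    model.fit(x_train, y_train, epochs=1, batch_size=128, verbose=0)
    _, acc = model.evaluate(x_test, y_test, verbose=0)
    print(f"{name}: test accuracy = {acc:.4f}")
```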
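The optimizer comparison could look like the following, assuming a small dense model on MNIST as the fixed testbed; the model and the optimizer list here are placeholders for whatever the experiment actually uses.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

(x_train, y_train), _ = tf.keras.datasets.mnist.load_data()
x_train = x_train.reshape(-1, 784) / 255.0

def make_model():
    # Rebuilt fresh for each optimizer so comparisons start from scratch
    return models.Sequential([
        layers.Dense(128, activation="relu", input_shape=(784,)),
        layers.Dense(10, activation="softmax"),
    ])

for opt in ["sgd", "rmsprop", "adam"]:
    model = make_model()
    model.compile(optimizer=opt, loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    hist = model.fit(x_train, y_train, epochs=3, batch_size=128,
                     validation_split=0.1, verbose=0)
    # Per-epoch validation accuracy shows each optimizer's convergence
    print(opt, [round(v, 4) for v in hist.history["val_accuracy"]])
```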
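Finally, a minimal LSTM forecasting sketch using a sliding-window setup. A synthetic sine-plus-trend series stands in for the stock-price data the experiment actually uses, and the window size and layer width are illustrative.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

# Synthetic series standing in for a stock price
t = np.arange(0, 400, dtype=float)
series = np.sin(0.05 * t) + 0.01 * t

def make_windows(series, window=20):
    # Each sample: `window` past values -> predict the next value
    X, y = [], []
    for i in range(len(series) - window):
        X.append(series[i:i + window])
        y.append(series[i + window])
    return np.array(X)[..., None], np.array(y)  # (samples, window, 1)

X, y = make_windows(series)
split = int(0.8 * len(X))

model = models.Sequential([
    layers.LSTM(50, input_shape=(X.shape[1], 1)),
    layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X[:split], y[:split], epochs=10, batch_size=32, verbose=0)
print("test MSE:", model.evaluate(X[split:], y[split:], verbose=0))
```

Keeping the train/test split chronological (first 80% for training) matters for time series: shuffling before splitting would leak future values into training.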