📈 Linear Regression from Scratch

This project implements linear regression from scratch in Python, without relying on scikit-learn's ready-made models.
It explores several optimization methods, compares their performance, and visualizes convergence and predictions.

👉 Open in Google Colab


🚀 Implemented Methods

  • Closed-form solution (Normal Equation)
  • Gradient Descent (GD)
  • Stochastic Gradient Descent (SGD)
  • Momentum-based Gradient Descent
  • Adam Optimizer
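As a rough illustration of two of the methods above (this is a hedged sketch, not the repository's actual code), the closed-form normal equation and batch gradient descent on the MSE loss can be written with NumPy as:

```python
import numpy as np

def normal_equation(X, y):
    """Closed-form fit: w = (X^T X)^+ X^T y, with a prepended bias column."""
    Xb = np.c_[np.ones(len(X)), X]            # add intercept term
    return np.linalg.pinv(Xb.T @ Xb) @ Xb.T @ y

def gradient_descent(X, y, lr=0.1, n_iters=5000):
    """Batch gradient descent on the mean-squared-error loss."""
    Xb = np.c_[np.ones(len(X)), X]
    w = np.zeros(Xb.shape[1])
    for _ in range(n_iters):
        grad = (2 / len(y)) * Xb.T @ (Xb @ w - y)  # d(MSE)/dw
        w -= lr * grad
    return w
```

On well-conditioned data both fits recover the same weights; the notebook's SGD, momentum, and Adam variants differ only in how the gradient step is computed and applied.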

📊 Features

  • Custom StandardScaler for data normalization
  • Metrics: MSE, RMSE, MAE, R² score
  • Visualization utilities:
    • Loss curve comparison
    • Prediction vs. actual plots
    • Residual analysis
    • Convergence rate (log-scale)
    • Step-by-step gradient descent visualization
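A minimal sketch of what the custom scaler and metrics might look like (assumed NumPy implementations for illustration, not the repository's code):

```python
import numpy as np

class StandardScaler:
    """Minimal scaler: subtract the per-column mean, divide by the std."""
    def fit(self, X):
        self.mean_ = X.mean(axis=0)
        self.std_ = X.std(axis=0)
        return self

    def transform(self, X):
        return (X - self.mean_) / self.std_

def mse(y_true, y_pred):
    return np.mean((y_true - y_pred) ** 2)

def rmse(y_true, y_pred):
    return np.sqrt(mse(y_true, y_pred))

def mae(y_true, y_pred):
    return np.mean(np.abs(y_true - y_pred))

def r2_score(y_true, y_pred):
    """1 minus the ratio of residual to total sum of squares."""
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    return 1 - ss_res / ss_tot
```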

🖼️ Example Outputs

The notebook produces the following plots (images not reproduced here):

  • Loss curves comparison
  • Predictions vs. actual comparison
  • Gradient descent steps


⚙️ Getting Started

Clone the repo and open the notebook:

git clone https://github.com/yourusername/linear-regression-scratch.git
cd linear-regression-scratch
jupyter notebook LinearRegression.ipynb

Or run directly on Colab: 👉 Colab Link
