classicsml : Classic Supervised ML # Cost Function # Gradient Descent
- Implement supervised learning models
- Minimize error with a cost function
  - SSD (sum of squared differences)
  - SAD (sum of absolute differences)
- Optimize the model using gradient descent
  - OLS (ordinary least squares)
- Experiment with learning rates and how they affect training
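The two cost functions listed above can be sketched in a few lines. This is a minimal illustration (the function names `ssd` and `sad` are just for this example): SSD squares each residual, so large errors dominate; SAD sums absolute residuals, so it is less sensitive to outliers.

```python
import numpy as np

def ssd(y_true, y_pred):
    """Sum of squared differences between targets and predictions."""
    return float(np.sum((y_true - y_pred) ** 2))

def sad(y_true, y_pred):
    """Sum of absolute differences; less sensitive to outliers than SSD."""
    return float(np.sum(np.abs(y_true - y_pred)))

y = np.array([1.0, 2.0, 3.0])
p = np.array([1.0, 2.5, 2.0])
print(ssd(y, p))  # residuals 0, -0.5, 1.0 -> 0.25 + 1.0 = 1.25
print(sad(y, p))  # |0| + |0.5| + |1.0| = 1.5
```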
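Gradient descent and the learning-rate experiment can be combined in one sketch. This is an assumed setup, not the repo's actual code: linear regression trained by batch gradient descent on mean squared error, run with several learning rates so you can compare how fast (or whether) each one converges.

```python
import numpy as np

def gradient_descent(X, y, lr, steps):
    """Fit w, b for y ~ X @ w + b by minimizing mean squared error."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(steps):
        err = X @ w + b - y           # residuals for the current parameters
        w -= lr * (2 / n) * (X.T @ err)  # gradient of MSE w.r.t. w
        b -= lr * (2 / n) * err.sum()    # gradient of MSE w.r.t. b
    return w, b

# Synthetic data: y = 3x + 1, no noise, so the fit should recover w=3, b=1.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 1))
y = 3.0 * X[:, 0] + 1.0

# Same data and step count, different learning rates: small rates converge
# slowly, overly large rates overshoot and can diverge.
for lr in (0.01, 0.1, 0.5):
    w, b = gradient_descent(X, y, lr, steps=200)
    print(f"lr={lr}: w={w[0]:.4f}, b={b:.4f}")
```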
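For comparison with the iterative approach, OLS has a closed-form solution via the normal equations. A small sketch (assuming NumPy; `np.linalg.lstsq` is used instead of an explicit matrix inverse because it is numerically more stable):

```python
import numpy as np

def ols(X, y):
    """Ordinary least squares: solve min ||Xb @ coef - y||^2 directly.

    A column of ones is appended so the intercept is fitted alongside
    the weights.
    """
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    coef, *_ = np.linalg.lstsq(Xb, y, rcond=None)
    return coef[:-1], coef[-1]

X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([1.0, 3.0, 5.0, 7.0])  # exactly y = 2x + 1
w, b = ols(X, y)
print(w, b)
```

Unlike gradient descent, this needs no learning rate or iteration count, but it scales poorly when the number of features is very large.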
