Applying Gradient Descent from scratch and analyzing the effect of changing the number of epochs and learning rate on the mean square error (MSE).

ahany42/GD-Optimization-Insights

Gradient Descent Implementation

This project implements Gradient Descent from scratch to analyze how two hyperparameters, the number of epochs and the learning rate, affect the Mean Square Error (MSE).

🚀 Overview

Gradient Descent is an optimization algorithm used to minimize the loss function in machine learning models. This project:

  • Implements Gradient Descent without external ML libraries.
  • Compares different learning rates and epoch values.
  • Evaluates the impact on Mean Square Error (MSE).
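A from-scratch implementation along these lines can be sketched as follows. This is a minimal illustration in pure Python (no ML libraries), fitting a line `y ≈ w*x + b` by minimizing MSE; the function name `gradient_descent` and its parameters are illustrative, not the repository's actual code.

```python
def gradient_descent(x, y, lr=0.01, epochs=1000):
    """Fit y ≈ w*x + b by minimizing MSE with batch gradient descent.

    Returns the fitted weights and the per-epoch MSE history,
    which can be plotted to visualize convergence.
    """
    w, b = 0.0, 0.0
    n = len(x)
    history = []
    for _ in range(epochs):
        # Residuals of the current predictions
        errors = [(w * xi + b) - yi for xi, yi in zip(x, y)]
        history.append(sum(e * e for e in errors) / n)
        # Gradients of MSE with respect to w and b
        dw = (2 / n) * sum(e * xi for e, xi in zip(errors, x))
        db = (2 / n) * sum(errors)
        # Update step scaled by the learning rate
        w -= lr * dw
        b -= lr * db
    return w, b, history

# Example: data generated from y = 2x; w should approach 2
w, b, history = gradient_descent([1.0, 2.0, 3.0, 4.0],
                                 [2.0, 4.0, 6.0, 8.0],
                                 lr=0.01, epochs=2000)
print(f"w={w:.3f}, b={b:.3f}, final MSE={history[-1]:.6f}")
```

The recorded `history` list is what makes the convergence analysis possible: plotting it against the epoch index shows how quickly (or whether) the MSE decreases for a given learning rate.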

📌 Features

  • Custom implementation of Gradient Descent.
  • Adjustable hyperparameters: epochs and learning rate.
  • Visual analysis of MSE convergence.
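A hyperparameter comparison of the kind described above might look like the following sketch, which trains the same model under several (learning rate, epochs) combinations and reports the final MSE for each. The helper name `mse_after_training` and the grid values are assumptions for illustration.

```python
def mse_after_training(x, y, lr, epochs):
    """Train y ≈ w*x + b with gradient descent; return the final MSE."""
    w, b, n = 0.0, 0.0, len(x)
    for _ in range(epochs):
        errors = [(w * xi + b) - yi for xi, yi in zip(x, y)]
        w -= lr * (2 / n) * sum(e * xi for e, xi in zip(errors, x))
        b -= lr * (2 / n) * sum(errors)
    errors = [(w * xi + b) - yi for xi, yi in zip(x, y)]
    return sum(e * e for e in errors) / n

# Data generated from y = 2x + 1
x = [1.0, 2.0, 3.0, 4.0]
y = [3.0, 5.0, 7.0, 9.0]

# Sweep a small grid of learning rates and epoch counts
for lr in (0.001, 0.01, 0.05):
    for epochs in (100, 1000):
        mse = mse_after_training(x, y, lr, epochs)
        print(f"lr={lr:<6} epochs={epochs:<5} MSE={mse:.6f}")
```

Sweeps like this make the trade-off visible: a very small learning rate needs many more epochs to reach a low MSE, while an overly large one can overshoot the minimum and fail to converge at all.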
