🧠 Made of Code: Machine Learning Regression with Gradient Descent and Stochastic Optimization
Updated Jun 16, 2025 · Jupyter Notebook
Large-scale machine learning / deep learning library for Python.
How to build a simple neural network from scratch using NumPy and linear algebra, without relying on high-level libraries like TensorFlow or Keras.
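As a hedged illustration of what "from scratch with NumPy" can mean (this is a generic sketch, not code from the repository above): a tiny two-layer network with manual forward and backward passes, trained on XOR. The architecture, learning rate, and data are all assumptions chosen for a self-contained demo.

```python
import numpy as np

# Minimal from-scratch network: 2 inputs -> 8 hidden sigmoid units -> 1 output,
# trained on XOR with plain gradient descent and hand-derived backprop (MSE cost).
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0, 1, (2, 8))   # input -> hidden weights
b1 = np.zeros(8)
W2 = rng.normal(0, 1, (8, 1))   # hidden -> output weights
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for _ in range(10_000):
    # forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # backward pass: chain rule through MSE and both sigmoid layers
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # gradient steps (averaged over the 4 training examples)
    W2 -= lr * h.T @ d_out / len(X)
    b2 -= lr * d_out.mean(axis=0)
    W1 -= lr * X.T @ d_h / len(X)
    b1 -= lr * d_h.mean(axis=0)

preds = (out > 0.5).astype(int)   # hard predictions for the 4 XOR rows
```

The same structure scales to more layers; only the chain-rule bookkeeping in the backward pass grows.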
Animates how Adaline classification works by minimizing its cost function, and compares three kinds of gradient descent.
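For readers unfamiliar with the variants such comparisons usually cover, here is a hedged sketch of batch, stochastic, and mini-batch gradient descent applied to an Adaline-style linear unit (w·x + b with an MSE cost). The data, learning rate, and epoch counts are made-up stand-ins, not taken from the repository.

```python
import numpy as np

# One linear unit fit three ways; the only difference is how much data
# each gradient update sees.
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, (200, 1))
y = 3.0 * X[:, 0] + 0.5 + rng.normal(0, 0.1, 200)

def fit(mode, epochs=100, lr=0.5, batch=20):
    w, b = 0.0, 0.0
    for _ in range(epochs):
        if mode == "batch":            # one update per epoch, full dataset
            idx_sets = [np.arange(len(X))]
        elif mode == "stochastic":     # one update per sample, shuffled order
            idx_sets = [[i] for i in rng.permutation(len(X))]
        else:                          # mini-batch: a compromise between the two
            perm = rng.permutation(len(X))
            idx_sets = np.array_split(perm, len(X) // batch)
        for idx in idx_sets:
            err = (X[idx, 0] * w + b) - y[idx]
            w -= lr * np.mean(err * X[idx, 0])
            b -= lr * np.mean(err)
    return w, b

results = {m: fit(m) for m in ("batch", "stochastic", "minibatch")}
```

All three converge near the true parameters (w ≈ 3, b ≈ 0.5) here; what an animation makes visible is the smooth trajectory of batch descent versus the noisy, cheaper-per-step path of the stochastic variants.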
Generalized local search tool
Linear regression and normal-equation implementation for predicting life expectancy in different countries.
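The normal equation referred to above is the closed-form least-squares solution θ = (XᵀX)⁻¹Xᵀy. A minimal sketch on synthetic stand-in data (the real project uses per-country life-expectancy features, which are not reproduced here):

```python
import numpy as np

# Synthetic stand-in: 100 rows, 2 invented features with a known linear target.
rng = np.random.default_rng(0)
X_raw = rng.uniform(0, 10, (100, 2))
y = 50 + 1.2 * X_raw[:, 0] + 0.8 * X_raw[:, 1] + rng.normal(0, 0.5, 100)

# Prepend a column of ones so theta[0] is the intercept.
X = np.column_stack([np.ones(len(X_raw)), X_raw])

# Numerically stable least-squares solve.
theta, *_ = np.linalg.lstsq(X, y, rcond=None)

# The literal normal equation, (X^T X) theta = X^T y; equivalent here,
# but less stable when X^T X is ill-conditioned.
theta_direct = np.linalg.solve(X.T @ X, X.T @ y)
```

Unlike gradient descent, this needs no learning rate or iterations, but the O(n³) solve becomes expensive for very many features.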
Practice exercises from the Machine Learning Specialization by DeepLearning.AI.
Rust implementation of the Adaline artificial neural network algorithm for educational purposes.
Step-by-Step Guide to an Optimization Problem Solver in Scala
Predicting House Sale Prices with Machine Learning
Uses matrix factorisation on a sparse user-movie rating matrix to predict the missing ratings, trained with stochastic and batch gradient descent.
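As a hedged sketch of that technique (illustrative only, not the repository's code): approximate the rating matrix as R ≈ PQᵀ and run SGD over the observed cells only, so the learned factors fill in the missing entries. Sizes, rank, and hyperparameters below are arbitrary demo choices.

```python
import numpy as np

# Build a low-rank ground-truth rating matrix and hide half its entries.
rng = np.random.default_rng(0)
n_users, n_items, k = 20, 15, 3
R = rng.normal(0, 1, (n_users, k)) @ rng.normal(0, 1, (n_items, k)).T
mask = rng.random((n_users, n_items)) < 0.5     # True where a rating is observed
obs = list(zip(*np.nonzero(mask)))

# Factor matrices, small random init; L2 regularization keeps them bounded.
P = rng.normal(0, 0.1, (n_users, k))
Q = rng.normal(0, 0.1, (n_items, k))
lr, reg = 0.02, 0.01
for _ in range(500):                             # SGD epochs
    rng.shuffle(obs)
    for u, i in obs:
        p_u = P[u].copy()                        # snapshot so both updates use old values
        err = R[u, i] - p_u @ Q[i]
        P[u] += lr * (err * Q[i] - reg * p_u)
        Q[i] += lr * (err * p_u - reg * Q[i])

R_hat = P @ Q.T   # dense predictions, including the unobserved cells
```

Batch gradient descent would instead accumulate the error over all observed cells before each update; the per-cell SGD loop above is the variant that scales to genuinely sparse rating matrices.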
This repository contains NumPy implementations of various ML algorithms.
Multiclass logistic regression, a classification pipeline, cross-validation, gradient descent, and regularization.
A logistic regression model for breast-cancer prediction from imaging features, implemented without vectorization, achieving 98.24% accuracy on a Kaggle dataset.
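"Without vectorization" typically means the gradients are accumulated with explicit Python loops instead of matrix operations. A hedged sketch of that style on synthetic two-feature data (the real project uses the Kaggle imaging features, not reproduced here):

```python
import math
import random

# Synthetic, linearly separable stand-in data: label is 1 when x1 + 2*x2 > 0.
random.seed(0)
data = []
for _ in range(200):
    x1, x2 = random.gauss(0, 1), random.gauss(0, 1)
    data.append(((x1, x2), 1 if x1 + 2 * x2 > 0 else 0))

w = [0.0, 0.0]
b = 0.0
lr = 0.1
for _ in range(300):                       # batch gradient descent, loop-based
    grad_w = [0.0, 0.0]
    grad_b = 0.0
    for (x, label) in data:
        z = w[0] * x[0] + w[1] * x[1] + b
        p = 1.0 / (1.0 + math.exp(-z))     # sigmoid probability
        for j in range(2):                 # cross-entropy gradient, feature by feature
            grad_w[j] += (p - label) * x[j]
        grad_b += p - label
    for j in range(2):
        w[j] -= lr * grad_w[j] / len(data)
    b -= lr * grad_b / len(data)

correct = sum(
    ((w[0] * x[0] + w[1] * x[1] + b > 0) == (label == 1))
    for (x, label) in data
)
accuracy = correct / len(data)
```

The vectorized equivalent replaces both inner loops with a single matrix-vector product, which is why NumPy versions of the same model run orders of magnitude faster on larger datasets.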