# stochastic-gradient-descent

Here are 10 public repositories matching this topic...

📶 Logistic regression classifier for bit decoding in binary vectors using stochastic gradient descent (SGD). Features performance evaluation, probabilistic modeling, confusion matrix analysis, and classification error interpretation. Developed in Python with Jupyter Notebook.

  • Updated Aug 10, 2025
  • Jupyter Notebook
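As a hedged illustration of the approach this repository describes (not its actual code), a logistic regression classifier for bit vectors trained with per-sample SGD might look like the following sketch; the toy dataset, learning rate, and epoch count are all assumptions:

```python
import numpy as np

def sgd_logistic_regression(X, y, lr=0.5, epochs=100, seed=0):
    """Train a logistic regression classifier with plain per-sample SGD.

    Illustrative sketch only -- not the repository's code.
    X: (n_samples, n_features) array of bits, y: 0/1 labels.
    """
    rng = np.random.default_rng(seed)
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for i in rng.permutation(len(X)):       # reshuffle every epoch
            z = X[i] @ w + b
            p = 1.0 / (1.0 + np.exp(-z))        # sigmoid probability
            grad = p - y[i]                     # dL/dz for the log loss
            w -= lr * grad * X[i]               # single-sample update
            b -= lr * grad
    return w, b

# Toy bit-decoding task: the label is the first bit of each vector.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 0, 1, 1])
w, b = sgd_logistic_regression(X, y)
preds = (1.0 / (1.0 + np.exp(-(X @ w + b))) > 0.5).astype(int)
```

Because the updates use one sample at a time, shuffling each epoch matters: it decorrelates consecutive gradients and tends to speed convergence.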

I built the Micrograd autograd engine: a functioning neural network with forward pass, backpropagation, and stochastic gradient descent, all implemented from scratch. It is derived from @karpathy's excellent micrograd lecture. Each notebook contains Andrej's lecture code and commentary alongside my own code, anecdotes, and additions.

  • Updated Jul 21, 2024
  • Jupyter Notebook
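For readers unfamiliar with micrograd, the core idea can be sketched as a scalar `Value` class supporting a forward pass, backpropagation via the chain rule, and an SGD update. This is a minimal illustration in the spirit of the lecture, not the notebooks' code; it supports only `+` and `*`:

```python
class Value:
    """Scalar with reverse-mode autograd, micrograd-style (minimal sketch)."""
    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None
        self._prev = set(_children)

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            self.grad += out.grad               # d(a+b)/da = 1
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            self.grad += other.data * out.grad  # d(a*b)/da = b
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # Topologically sort the graph, then apply the chain rule
        # from the output node back to the leaves.
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

# Forward pass: loss = (w*x - y_target)^2, then one SGD step on w.
w, x, y_target = Value(2.0), Value(3.0), 12.0
err = w * x + Value(-y_target)
loss = err * err
loss.backward()                 # fills w.grad = 2 * err * x = -36
w.data -= 0.01 * w.grad         # SGD update: w <- w - lr * dL/dw
```

Note that gradients accumulate with `+=`, so a node used twice (like `err` above) correctly sums contributions from every path through the graph.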

Portfolio of Jupyter Notebooks demonstrating various ML models and concepts learned during my graduate machine learning course and developed independently afterward. A bottom-up modeling approach with NumPy is generally used to showcase the mathematical foundations; higher-level libraries (scikit-learn) are used for optimized algorithm implementations.

  • Updated Jul 25, 2025

Your all-in-one Machine Learning resource – from-scratch implementations to ensemble learning and real-world model tuning. This repository is a complete collection of 25+ essential ML algorithms written in clean, beginner-friendly Jupyter Notebooks. Each algorithm is explained with intuitive theory, visualizations, and hands-on implementation.

  • Updated Jul 22, 2025
  • Jupyter Notebook
