
🧠 Deep Dive into Neural Connectivity: Crafting a Full-Fledged Feed-Forward Neural Network 🚀

Unveil the intricacies of neural networks! In this project, we design a fully connected, feed-forward neural network, an architecture in which every neuron connects to every neuron in the succeeding layer. With the renowned MNIST dataset 📊 of handwritten digits ✏️ as our backdrop, we push the boundaries of pattern recognition and machine learning.
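As a concrete illustration of the fully connected layout, here is a minimal NumPy sketch of the forward pass: each layer multiplies the previous layer's activations by a dense weight matrix, adds a bias, and applies a sigmoid. The 784-30-10 layer sizes and the sigmoid activation are illustrative assumptions, not necessarily the configuration used in this repository.

```python
import numpy as np

def sigmoid(z):
    """Element-wise logistic activation."""
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, weights, biases):
    """Propagate an input column vector through each fully connected layer.

    weights[l] has shape (n_out, n_in) and biases[l] has shape (n_out, 1),
    so every neuron receives a weighted input from every neuron in the
    previous layer.
    """
    a = x
    for W, b in zip(weights, biases):
        a = sigmoid(W @ a + b)
    return a

# Example: a 784-30-10 network for flattened 28x28 MNIST digits (hypothetical sizes).
rng = np.random.default_rng(0)
sizes = [784, 30, 10]
weights = [rng.standard_normal((n_out, n_in)) for n_in, n_out in zip(sizes[:-1], sizes[1:])]
biases = [np.zeros((n_out, 1)) for n_out in sizes[1:]]
output = forward(rng.standard_normal((784, 1)), weights, biases)  # shape (10, 1)
```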

Neural Network Characteristics 🛠

- Fully Connected Layers: every neuron connects to all neurons in the subsequent layer.
- Dataset: the acclaimed MNIST dataset serves as our primary training and testing ground.
- Optimization: Gradient Descent ⚙️

Gradient Descent (GD), a first-order iterative optimization technique, lies at the heart of this project. Used to minimize the cost function, GD lets the network continuously refine its weights and biases, improving accuracy as training progresses (see the sketch below). This project underscores not just the construction of a neural network, but also the central role of optimization in elevating its performance.
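The sketch below shows what one gradient-descent update could look like for such a network: a forward pass that caches activations, backpropagation of the error, and a step against the gradient for every weight matrix and bias vector. The quadratic cost, sigmoid activations, and learning rate are assumptions for illustration and may differ from the repository's actual implementation.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gd_step(x, y, weights, biases, lr=0.5):
    """One gradient-descent update on a batch (quadratic cost, sigmoid layers).

    x: inputs of shape (n_in, batch); y: one-hot targets of shape (n_out, batch).
    Returns updated (weights, biases).
    """
    # Forward pass, caching every layer's activations for backpropagation.
    activations = [x]
    for W, b in zip(weights, biases):
        activations.append(sigmoid(W @ activations[-1] + b))

    # Output-layer error for a quadratic cost: (a - y) * sigmoid'(z), with
    # sigmoid'(z) expressed as a * (1 - a).
    batch = x.shape[1]
    delta = (activations[-1] - y) * activations[-1] * (1 - activations[-1])

    new_w, new_b = list(weights), list(biases)
    for l in range(len(weights) - 1, -1, -1):
        grad_w = delta @ activations[l].T / batch
        grad_b = delta.mean(axis=1, keepdims=True)
        # Gradient-descent update: move weights and biases against the gradient.
        new_w[l] = weights[l] - lr * grad_w
        new_b[l] = biases[l] - lr * grad_b
        if l > 0:
            # Propagate the error to the previous layer.
            delta = (weights[l].T @ delta) * activations[l] * (1 - activations[l])
    return new_w, new_b
```

Repeating such updates over many passes through the MNIST training set is what drives the cost down and the classification accuracy up.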
