Implementation of Artificial Neural Networks using NumPy
Updated Jun 19, 2023 · Python
A look at some simple autoencoders for the CIFAR-10 dataset, including a denoising autoencoder. Python code included.
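As a rough illustration of the denoising setup, here is a minimal tf.keras sketch (not the repository's own code; the layer sizes and noise level are assumptions): the inputs are corrupted with Gaussian noise while the reconstruction targets stay clean.

```python
import numpy as np
import tensorflow as tf

# Load CIFAR-10 and flatten the 32x32x3 images to vectors in [0, 1]
(x_train, _), _ = tf.keras.datasets.cifar10.load_data()
x_train = x_train.reshape(-1, 32 * 32 * 3).astype("float32") / 255.0

# Corrupt the inputs with Gaussian noise; the targets stay clean
noisy = np.clip(x_train + 0.1 * np.random.randn(*x_train.shape), 0.0, 1.0)

inputs = tf.keras.Input(shape=(3072,))
encoded = tf.keras.layers.Dense(128, activation="relu")(inputs)       # bottleneck
decoded = tf.keras.layers.Dense(3072, activation="sigmoid")(encoded)  # reconstruction
autoencoder = tf.keras.Model(inputs, decoded)

autoencoder.compile(optimizer="adam", loss="mse")
autoencoder.fit(noisy, x_train, epochs=10, batch_size=256)  # learn to undo the noise
```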
Looking at the manifold hypothesis in deep learning. Creating a simple spiral dataset lets me show how neural networks follow an optimal packing strategy during training.
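A minimal sketch of how such a two-class spiral dataset might be generated (the number of turns and the noise level are assumptions, not the article's exact parameters):

```python
import numpy as np

def make_spirals(n_per_class=500, turns=2.0, noise=0.2, seed=0):
    """Two interleaved spirals, one per class."""
    rng = np.random.default_rng(seed)
    theta = np.sqrt(rng.random(n_per_class)) * turns * 2 * np.pi  # angle grows along the arm
    r = theta                                                     # radius proportional to angle
    clean = np.stack([r * np.cos(theta), r * np.sin(theta)], axis=1)
    arm_a = clean + noise * rng.standard_normal(clean.shape)
    arm_b = -clean + noise * rng.standard_normal(clean.shape)     # second arm rotated 180 degrees
    X = np.vstack([arm_a, arm_b])
    y = np.hstack([np.zeros(n_per_class), np.ones(n_per_class)])
    return X, y

X, y = make_spirals()
```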
A threat detection system using a hybrid (machine learning + lexical analysis) approach.
A neural network to classify the handwritten digits 0-9 from the MNIST dataset. No NN/ML libraries used.
A neural network with a single hidden layer that predicts which article of clothing is shown in a Fashion-MNIST image.
A neural network (NN) with two hidden layers, in addition to the input and output layers. The code lets the user choose sigmoid, tanh, or ReLU as the activation function. Prediction accuracy is computed at the end.
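For reference, the three activation functions and their derivatives as they are commonly written in NumPy (a sketch of the standard definitions, not the repository's exact code):

```python
import numpy as np

ACTIVATIONS = {
    "sigmoid": (lambda z: 1.0 / (1.0 + np.exp(-z)),
                lambda a: a * (1.0 - a)),          # derivative written in terms of the output a
    "tanh":    (np.tanh,
                lambda a: 1.0 - a ** 2),
    "relu":    (lambda z: np.maximum(0.0, z),
                lambda a: (a > 0).astype(float)),
}

act, act_prime = ACTIVATIONS["relu"]  # user-selected activation
```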
Visualization of deep CNN layer filters using TensorFlow.
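One common way to inspect convolutional filters in TensorFlow/Keras is to pull out a layer's kernel weights and render each as a small image; the sketch below assumes a pretrained VGG16 and its first conv layer, which may differ from what the repository actually does.

```python
import matplotlib.pyplot as plt
import tensorflow as tf

model = tf.keras.applications.VGG16(weights="imagenet", include_top=False)

# Kernels of the first conv layer have shape (3, 3, 3, 64): 64 RGB 3x3 filters
filters, _ = model.get_layer("block1_conv1").get_weights()
filters = (filters - filters.min()) / (filters.max() - filters.min())  # scale to [0, 1] for display

fig, axes = plt.subplots(4, 8, figsize=(8, 4))
for i, ax in enumerate(axes.flat):
    ax.imshow(filters[:, :, :, i])  # each kernel rendered as a tiny RGB patch
    ax.axis("off")
plt.show()
```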
A Python neural network built from scratch that classifies images of handwritten digits.
This project is built entirely with NumPy. It implements basic neural network concepts including backpropagation, hidden layers, activation functions, and gradient descent.
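A compact sketch of how those pieces fit together in plain NumPy (a single hidden layer, sigmoid output, full-batch gradient descent; the toy data and hyperparameters are illustrative, not the project's):

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy data: X has shape (n_samples, n_features), y is a column of 0/1 labels
X = rng.standard_normal((200, 2))
y = (X[:, 0] * X[:, 1] > 0).astype(float).reshape(-1, 1)

# Parameters: one hidden layer of 8 tanh units
W1 = rng.standard_normal((2, 8)) * 0.1
b1 = np.zeros((1, 8))
W2 = rng.standard_normal((8, 1)) * 0.1
b2 = np.zeros((1, 1))
lr = 0.5

for epoch in range(2000):
    # Forward pass
    h = np.tanh(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2)

    # Backward pass (binary cross-entropy loss)
    dz2 = (p - y) / len(X)
    dW2, db2 = h.T @ dz2, dz2.sum(axis=0, keepdims=True)
    dz1 = (dz2 @ W2.T) * (1.0 - h ** 2)          # tanh derivative
    dW1, db1 = X.T @ dz1, dz1.sum(axis=0, keepdims=True)

    # Gradient descent step
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

accuracy = ((p > 0.5) == y).mean()
```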
An implementation of a neural network in vanilla Python that trains on the MNIST handwritten digit classification problem.
Implementation of the Neural Style Transfer algorithm with the PyTorch library.
Python implementation that explores how different parameters affect a feed-forward neural network with a single hidden layer trained with gradient descent.
Implementing a 2-class Classification Neural Network with a Single Hidden Layer
Fracture.v1i_Reduced_SSD: fracture detection using SSD on a reduced but homogeneous selection of the Roboflow dataset at https://universe.roboflow.com/landy-aw2jb/fracture-ov5p1/dataset/1. One version uses a VGG16 backbone and another uses only linear layers.
A feed-forward neural network that classifies Facebook post likes into low, moderate, or high classes; backpropagation is implemented with a decaying learning rate.
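A decaying learning rate is usually just a schedule applied per epoch. One common form is inverse-time decay, sketched below (the constants are illustrative, not necessarily the repository's):

```python
def decayed_learning_rate(initial_lr, epoch, decay_rate=0.01):
    """Inverse-time decay: the learning rate shrinks as training progresses."""
    return initial_lr / (1.0 + decay_rate * epoch)

# Example: lr falls gradually from 0.5 over training
for epoch in range(100):
    lr = decayed_learning_rate(0.5, epoch)
    # ... run one epoch of backpropagation using this lr ...
```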
Neural networks from scratch.
Implements the backpropagation algorithm for a multi-layer perceptron in incremental (online) mode.
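In incremental (online) mode the weights are updated after every training example rather than after a full batch. A sketch of that training loop, where `forward` and `backward` are hypothetical placeholders for the per-sample passes (not functions from this repository):

```python
import numpy as np

def train_incremental(X, y, weights, forward, backward, lr=0.1, epochs=50):
    """Online backpropagation: one gradient step per training example.
    `forward` and `backward` are placeholders supplied by the caller."""
    rng = np.random.default_rng(0)
    n = len(X)
    for _ in range(epochs):
        for i in rng.permutation(n):           # shuffle the sample order each epoch
            x_i, y_i = X[i:i + 1], y[i:i + 1]  # keep 2-D shapes for a single sample
            activations = forward(x_i, weights)
            grads = backward(x_i, y_i, activations, weights)
            for k in weights:                  # immediate update after each sample
                weights[k] -= lr * grads[k]
    return weights
```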
This code implements a neural network from scratch without using any libraries.