Multi-Layer-Perceptron

A self-coded multi-layer perceptron, written without using any available neural-network, connectionist, or machine-learning libraries.

This software does the following:

  • Creates a new MLP with any number of inputs, any number of outputs (sigmoidal or linear), and any number of hidden units (sigmoidal or tanh) in a single hidden layer
  • Initialises the weights of the MLP to small random values
  • Predicts the outputs corresponding to an input vector
  • Implements learning by backpropagation
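As an illustration of the first three points (a minimal sketch, not the notebook's actual code; the function names `make_mlp` and `predict` are assumed here), creating an MLP with small random weights and running a forward pass can be done with only the standard library:

```python
import math
import random

def make_mlp(n_in, n_hidden, n_out, scale=0.1):
    """Initialise all weights and biases to small random values in [-scale, scale]."""
    rng = random.Random(0)
    w_h = [[rng.uniform(-scale, scale) for _ in range(n_in)] for _ in range(n_hidden)]
    b_h = [rng.uniform(-scale, scale) for _ in range(n_hidden)]
    w_o = [[rng.uniform(-scale, scale) for _ in range(n_hidden)] for _ in range(n_out)]
    b_o = [rng.uniform(-scale, scale) for _ in range(n_out)]
    return w_h, b_h, w_o, b_o

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def predict(mlp, x):
    """Forward pass: sigmoidal hidden layer, sigmoidal outputs."""
    w_h, b_h, w_o, b_o = mlp
    h = [sigmoid(sum(w * xi for w, xi in zip(ws, x)) + b) for ws, b in zip(w_h, b_h)]
    return [sigmoid(sum(w * hi for w, hi in zip(ws, h)) + b) for ws, b in zip(w_o, b_o)]

mlp = make_mlp(2, 2, 1)
print(predict(mlp, [0, 1]))  # before training, the single output sits near 0.5
```

Because the weights start small, every unit operates near the middle of the sigmoid, so untrained outputs cluster around 0.5 regardless of the input.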

Testing:

  1. Train an MLP with two inputs, two hidden units, and one output on the following examples (the XOR function): ((0, 0), 0) ((0, 1), 1) ((1, 0), 1) ((1, 1), 0)
  2. At the end of training, check if the MLP predicts correctly all the examples.
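A self-contained sketch of that test (an illustration under assumed hyperparameters, not the notebook's code; whether a 2-2-1 sigmoid network escapes XOR's well-known local minima depends on the random seed and learning rate):

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

random.seed(42)
# 2 inputs -> 2 sigmoidal hidden units -> 1 sigmoidal output,
# weights initialised to small random values (bias folded in as the third weight).
w_h = [[random.uniform(-0.5, 0.5) for _ in range(3)] for _ in range(2)]
w_o = [random.uniform(-0.5, 0.5) for _ in range(3)]

data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
lr = 0.5  # assumed learning rate

def forward(x):
    h = [sigmoid(w[0] * x[0] + w[1] * x[1] + w[2]) for w in w_h]
    y = sigmoid(w_o[0] * h[0] + w_o[1] * h[1] + w_o[2])
    return h, y

for epoch in range(20000):
    for x, t in data:
        h, y = forward(x)
        # Backpropagation: sigmoid derivative is y * (1 - y).
        delta_o = (t - y) * y * (1 - y)
        # Update hidden weights first, using the *old* output weights.
        for j in range(2):
            delta_h = delta_o * w_o[j] * h[j] * (1 - h[j])
            w_h[j][0] += lr * delta_h * x[0]
            w_h[j][1] += lr * delta_h * x[1]
            w_h[j][2] += lr * delta_h
        for j in range(2):
            w_o[j] += lr * delta_o * h[j]
        w_o[2] += lr * delta_o

# Step 2: check the trained network against all four examples.
for x, t in data:
    _, y = forward(x)
    print(x, "target:", t, "predicted:", round(y, 3))
```

With a seed that avoids the local minima, the four predictions land close to their 0/1 targets; otherwise the net typically gets three of the four examples right, which is why checking all examples at the end of training is part of the test.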

In case GitHub can't render the notebook ("Sorry, something went wrong. Reload?"): https://nbviewer.jupyter.org/github/michjord0001/Multi-Layer-Perceptron/blob/master/MultiLayerPerceptron.ipynb
