Basic Neural Network

A basic neural network with one hidden layer. The network takes two inputs and produces a single continuous output.

Each synapse uses a sigmoid activation function.
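
As a minimal sketch of such a forward pass (assuming Python with NumPy; the weight shapes and the absence of bias terms are illustrative, not taken from this repository's code):

```python
import numpy as np

def sigmoid(z):
    # Squash each value into the open interval (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def forward(X, W1, W2):
    # X:  (n_samples, 2) input matrix
    # W1: (2, n_hidden) input-to-hidden weights
    # W2: (n_hidden, 1) hidden-to-output weights
    hidden = sigmoid(X @ W1)     # hidden-layer activations
    return sigmoid(hidden @ W2)  # one continuous output per sample
```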

Optimization is performed with the BFGS quasi-Newton algorithm rather than plain gradient descent.
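
BFGS builds up an approximation of the loss curvature from successive gradients, so it typically converges in far fewer iterations than vanilla gradient descent on small networks. Below is a hedged sketch of how such training might be wired up with SciPy's optimizer; the loss function, weight shapes, and hyperparameters here are assumptions for illustration, not the repository's actual implementation:

```python
import numpy as np
from scipy.optimize import minimize

def loss(params, X, y, n_hidden):
    # BFGS optimizes a flat vector; unpack it into the two weight matrices.
    W1 = params[: 2 * n_hidden].reshape(2, n_hidden)
    W2 = params[2 * n_hidden :].reshape(n_hidden, 1)
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
    preds = sigmoid(sigmoid(X @ W1) @ W2)   # forward pass
    return 0.5 * np.mean((preds - y) ** 2)  # mean squared error

# Random data for illustration only; see the note below.
n_hidden = 3
X, y = np.random.rand(20, 2), np.random.rand(20, 1)
params0 = np.random.randn(2 * n_hidden + n_hidden)
result = minimize(loss, params0, args=(X, y, n_hidden), method="BFGS")
print(result.fun)  # final training loss
```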

Steps to run locally

  1. Clone the repository: git clone https://github.com/Alexander-Wen/basic-neural-network

TODO

  • add more layers
  • create real training data
  • create real test data
  • create a class to evaluate prediction accuracy
  • fix that one bug

Note

The training and testing data are randomly generated, so the network's predictions will not be meaningful.
