A basic neural network with one hidden layer. The network takes 2 inputs and produces one continuous output.
Each neuron uses a sigmoid activation function.
The weights are optimized with the BFGS algorithm, a quasi-Newton alternative to plain gradient descent.
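As a rough illustration of this setup, here is a minimal sketch (not the repo's actual code) of a 2-input, one-hidden-layer, one-output network with sigmoid activations, fit with BFGS via `scipy.optimize.minimize`. The hidden width, the SciPy dependency, and the mean-squared-error loss are all assumptions for illustration.

```python
# Minimal sketch: 2 inputs -> sigmoid hidden layer -> sigmoid output,
# trained with BFGS. Layer width and loss are assumed, not taken from this repo.
import numpy as np
from scipy.optimize import minimize

HIDDEN = 3  # hidden layer width (assumed)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def unpack(theta):
    # Flat parameter vector -> (W1, b1, W2, b2) for a 2-HIDDEN-1 network.
    W1 = theta[:2 * HIDDEN].reshape(2, HIDDEN)
    b1 = theta[2 * HIDDEN:3 * HIDDEN]
    W2 = theta[3 * HIDDEN:4 * HIDDEN].reshape(HIDDEN, 1)
    b2 = theta[4 * HIDDEN:]
    return W1, b1, W2, b2

def forward(theta, X):
    W1, b1, W2, b2 = unpack(theta)
    hidden = sigmoid(X @ W1 + b1)      # hidden activations
    return sigmoid(hidden @ W2 + b2)   # continuous output in (0, 1)

def loss(theta, X, y):
    # Mean squared error between predictions and targets.
    preds = forward(theta, X).ravel()
    return np.mean((preds - y) ** 2)

# Randomly generated data, mirroring the note at the end of this README.
rng = np.random.default_rng(0)
X_train = rng.random((20, 2))
y_train = rng.random(20)

theta0 = rng.normal(scale=0.1, size=4 * HIDDEN + 1)
result = minimize(loss, theta0, args=(X_train, y_train), method="BFGS")
print("final training loss:", result.fun)
```

With no gradient supplied, SciPy's BFGS falls back to finite-difference gradients, which is fine at this tiny parameter count.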
- clone the repo locally by running `git clone https://github.com/Alexander-Wen/basic-neural-network`
TODO:
- add more layers
- create real training data
- create real test data
- create a class to measure prediction accuracy (see the sketch after this list)
- fix that one bug
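For the planned accuracy class, something along these lines might work. The class name `Evaluator`, the tolerance-based notion of a "correct" prediction, and the MSE metric are all hypothetical; since the output is continuous, plain classification accuracy does not apply directly.

```python
# Hedged sketch of a possible accuracy/evaluation class; not existing repo code.
import numpy as np

class Evaluator:
    """Scores a model's predictions against known targets."""

    def __init__(self, tolerance=0.1):
        # A continuous prediction counts as "correct" if it lands
        # within `tolerance` of the target value.
        self.tolerance = tolerance

    def mse(self, predictions, targets):
        # Mean squared error over all samples.
        predictions = np.asarray(predictions, dtype=float)
        targets = np.asarray(targets, dtype=float)
        return float(np.mean((predictions - targets) ** 2))

    def accuracy(self, predictions, targets):
        # Fraction of predictions within `tolerance` of the target.
        predictions = np.asarray(predictions, dtype=float)
        targets = np.asarray(targets, dtype=float)
        return float(np.mean(np.abs(predictions - targets) <= self.tolerance))
```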
Note that the training and test data are randomly generated, so the network's predictions are not expected to be meaningful.
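For reference, a minimal sketch of what "randomly generated" might mean here (the repo's actual generator may differ): inputs and targets drawn independently, so there is no relationship for the network to learn.

```python
# Independent uniform inputs and targets: no learnable signal (assumed scheme).
import numpy as np

rng = np.random.default_rng()
X_test = rng.random((10, 2))   # 10 samples, 2 input features
y_test = rng.random(10)        # targets uncorrelated with the inputs
```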