This is a simple implementation of a neural network with x inputs, two hidden layers, and one output, written in Java. It uses sigmoid or ReLU as the activation function.
If you are new to neural networks, this project can help you understand forward propagation and backpropagation.
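To illustrate the idea, here is a minimal, self-contained sketch of the same concepts — it is not this repository's actual code or API. It assumes a hypothetical `TinyNet` class with one sigmoid hidden layer (rather than two) and one sigmoid output, trained on XOR with plain stochastic backpropagation:

```java
import java.util.Random;

// Hedged sketch (not the repository's API): a tiny feed-forward network
// with 2 inputs, one hidden sigmoid layer, and 1 sigmoid output, trained
// on the XOR problem with plain stochastic backpropagation.
public class TinyNet {
    static final double[][] X = {{0,0},{0,1},{1,0},{1,1}};
    static final double[] T = {0,1,1,0};
    static final int HID = 4;

    static double[][] w1 = new double[HID][2];
    static double[] b1 = new double[HID];
    static double[] w2 = new double[HID];
    static double b2;

    static double sigmoid(double x) { return 1.0 / (1.0 + Math.exp(-x)); }
    // derivative expressed via the activation value y = sigmoid(x)
    static double dSigmoid(double y) { return y * (1.0 - y); }

    // forward pass: fills hOut with hidden activations, returns the output
    static double forward(double[] in, double[] hOut) {
        for (int j = 0; j < HID; j++) {
            double z = b1[j];
            for (int i = 0; i < in.length; i++) z += w1[j][i] * in[i];
            hOut[j] = sigmoid(z);
        }
        double z = b2;
        for (int j = 0; j < HID; j++) z += w2[j] * hOut[j];
        return sigmoid(z);
    }

    // mean squared error over the whole training set
    static double mse() {
        double e = 0;
        double[] h = new double[HID];
        for (int s = 0; s < X.length; s++) {
            double d = forward(X[s], h) - T[s];
            e += d * d;
        }
        return e / X.length;
    }

    // returns {MSE before training, MSE after training}
    static double[] train(int epochs, double lr) {
        Random rnd = new Random(42);
        for (int j = 0; j < HID; j++) {
            for (int i = 0; i < 2; i++) w1[j][i] = rnd.nextDouble() - 0.5;
            w2[j] = rnd.nextDouble() - 0.5;
        }
        double before = mse();
        double[] h = new double[HID];
        for (int e = 0; e < epochs; e++) {
            for (int s = 0; s < X.length; s++) {
                double out = forward(X[s], h);
                // backpropagation: output-layer error term
                double dOut = (out - T[s]) * dSigmoid(out);
                for (int j = 0; j < HID; j++) {
                    // hidden-layer error term, computed with the old w2[j]
                    double dHid = dOut * w2[j] * dSigmoid(h[j]);
                    w2[j] -= lr * dOut * h[j];
                    for (int i = 0; i < 2; i++) w1[j][i] -= lr * dHid * X[s][i];
                    b1[j] -= lr * dHid;
                }
                b2 -= lr * dOut;
            }
        }
        return new double[]{before, mse()};
    }

    public static void main(String[] args) {
        double[] e = train(20000, 0.5);
        System.out.printf("MSE before: %.4f, after: %.4f%n", e[0], e[1]);
        double[] h = new double[HID];
        for (double[] in : X)
            System.out.printf("%.0f XOR %.0f -> %.3f%n", in[0], in[1], forward(in, h));
    }
}
```

The backward pass applies the chain rule layer by layer: the output error term is scaled by the sigmoid derivative, then distributed to each hidden neuron through its outgoing weight before that weight is updated.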
GraxCode/EasyNeuralNetwork