Recipe for a simple GAN in the Golang ecosystem via the Gorgonia library
| Generating f(x) = x^2 | Generating f(x) = sin(x) |
|---|---|
This is just a repository with a simple example of how to build a GAN with Gorgonia.
What is a GAN?
- Short: such a network is just two neural networks (a Discriminator and a Generator) contesting with each other. The Generator must "cheat" the Discriminator, while the Discriminator must detect the lies.
- Long: see the Wiki page about GANs - https://en.wikipedia.org/wiki/Generative_adversarial_network
Note: although the repository contains some wrappers/aliases and helper abstractions and functions, it does not pretend to be a high-level machine learning framework.
Note #2: By the way... the code is a bit ugly since I've decided to handle errors instead of using panic(...) calls. Panicking is reserved for the main functions of the examples only.
The examples folder currently covers a limited set of layer types:
- Linear
- Convolutional
- Maxpool
- AvgPool
- Flatten
- Reshape
- Dropout
- Embedding
- LSTM
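For intuition, the Flatten and Reshape layers from the list above amount to row-major shape bookkeeping. A plain-Go sketch of that idea (the actual layers operate on Gorgonia tensors; `flatten` and `reshape` here are illustrative helpers, not the repository's API):

```go
package main

import "fmt"

// flatten collapses a 2-D matrix into a 1-D slice in row-major order —
// what a Flatten layer does to a tensor before feeding a Linear layer.
func flatten(m [][]float64) []float64 {
	if len(m) == 0 {
		return nil
	}
	out := make([]float64, 0, len(m)*len(m[0]))
	for _, row := range m {
		out = append(out, row...)
	}
	return out
}

// reshape is the inverse: it splits a 1-D slice back into rows of the
// given width, mirroring a Reshape layer.
func reshape(v []float64, rows, cols int) ([][]float64, error) {
	if len(v) != rows*cols {
		return nil, fmt.Errorf("cannot reshape %d elements into %dx%d", len(v), rows, cols)
	}
	out := make([][]float64, rows)
	for i := range out {
		out[i] = v[i*cols : (i+1)*cols]
	}
	return out, nil
}

func main() {
	m := [][]float64{{1, 2, 3}, {4, 5, 6}}
	flat := flatten(m) // [1 2 3 4 5 6]
	fmt.Println(flat)
	back, err := reshape(flat, 3, 2) // [[1 2] [3 4] [5 6]]
	if err != nil {
		panic(err)
	}
	fmt.Println(back)
}
```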
I just want to do that in the Golang ecosystem.
The code is written in Go - https://golang.org/
Machine learning library used - Gorgonia
Plotting library - gonum
- Get the repo
```shell
git clone https://github.com/LdDl/gan-go.git
```
- Navigate to the examples folder
```shell
cd gan-go/cmd/examples
```
- Pick one of the examples, e.g. parabola:
```shell
cd parabola
```
- Run the example
```shell
go run main.go
```
- Output
After the program terminates there should be multiple files:
- A single image for the reference function - reference_function.png
- Multiple images for the generated data on the N-th epoch - gen_reference_fun_%N-th epoch%.png
- A single image for the generated data on the last epoch - gen_reference_func_final.png
Example for parabola:
Actual reference function:
Generated data on 0-th epoch:
Generated data on 10-th epoch:
Generated data on 60-th epoch:
Generated data on 150-th epoch:
Generated data on last epoch:
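The reference function in the parabola example above is f(x) = x^2. A minimal stdlib-only sketch of sampling such ground-truth data — the data the Generator learns to imitate — is shown below; the repository itself builds this with Gorgonia tensors and plots via gonum, and `sampleParabola` is a hypothetical helper for illustration:

```go
package main

import "fmt"

// sampleParabola evaluates the reference function f(x) = x^2 at n
// evenly spaced points on [lo, hi].
func sampleParabola(lo, hi float64, n int) (xs, ys []float64) {
	step := (hi - lo) / float64(n-1)
	for i := 0; i < n; i++ {
		x := lo + float64(i)*step
		xs = append(xs, x)
		ys = append(ys, x*x)
	}
	return xs, ys
}

func main() {
	xs, ys := sampleParabola(-2, 2, 5)
	for i := range xs {
		fmt.Printf("f(%.1f) = %.2f\n", xs[i], ys[i])
	}
}
```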
Current stage of the TODO list for future releases:
- Reduce code duplication in the .Fwd() method of each neural network type (GAN/Discriminator/Generator)
- Switch Layer from a struct to an interface, or use another technique for building clean code
- Add basic layers: Linear, Convolutional, Maxpool, Flatten
- Deal with batch processing
- More loss functions:
    - Cross Entropy
    - Binary Cross Entropy
    - L1
    - Huber (pseudo)
- Examples for text data generation (WIP)
    - Simple LSTM
- Proper layer types for RNN (WIP)
- Examples:
    - RNN
    - GRU
    - Embedding
@TODO
If you have any troubles or questions, please open an issue.
If you want to improve this library, fix some bugs, etc., you can make a PR.