vertinski/strong-zero

📎 strong.zero-0.1.9

Generative Multi Layer Perceptron 🤖


This is a simple Generative Multi Layer Perceptron (G-MLP) for time-series learning and generation. It is written in NumPy and demonstrates several interesting concepts. The loss function is fully custom, and the learning schedule is hard-coded for now; this will be updated.

Weights are initialized using He et al. initialization.
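He et al. initialization draws weights from a zero-mean Gaussian with standard deviation √(2 / fan_in), which keeps activation variance stable through ReLU-family layers. A minimal sketch (the layer sizes here are illustrative, not Strong.Zero's actual architecture):

```python
import numpy as np

def he_init(fan_in, fan_out, rng=np.random.default_rng(0)):
    # He et al. initialization: zero-mean Gaussian with std = sqrt(2 / fan_in),
    # suited to ReLU-style activations.
    return rng.normal(0.0, np.sqrt(2.0 / fan_in), size=(fan_in, fan_out))

W1 = he_init(64, 128)
print(W1.shape)  # (64, 128)
```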

The training process adds uniform noise to the training data, which helps the model generalize and increases robustness. Training also includes scheduled, periodic learning-rate pumps, which produce noticeable drops in the loss.
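The noise augmentation step can be sketched as follows. The amplitude value is a hypothetical parameter for illustration; Strong.Zero's actual noise magnitude is set by its hard-coded schedule:

```python
import numpy as np

def add_uniform_noise(batch, amplitude, rng):
    # Jitter training inputs with zero-mean uniform noise in [-amplitude, amplitude],
    # so the model never sees exactly the same sample twice.
    return batch + rng.uniform(-amplitude, amplitude, size=batch.shape)

rng = np.random.default_rng(42)
x = np.linspace(0.0, 1.0, 8)
noisy = add_uniform_noise(x, 0.05, rng)
print(np.max(np.abs(noisy - x)) <= 0.05)  # True
```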

The Combing and Generation process

The generation process involves "data combing": Strong.Zero "combs" the available time-series data using simple scan heads -- sparse data points sampled from the existing data. It then generates the next chunk of time-series points, appends it to the existing data, and repeats the process with the updated dataset.
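The comb-and-generate loop described above can be sketched like this. The evenly spaced scan heads, the chunk size, and the `model` callable are assumptions for illustration; the repository defines its own head placement and network:

```python
import numpy as np

def comb(series, n_heads):
    # "Comb" the series with sparse scan heads: sample n_heads evenly
    # spaced points from the data available so far.
    idx = np.linspace(0, len(series) - 1, n_heads).astype(int)
    return series[idx]

def generate(series, model, n_heads=16, chunk=8, steps=4):
    # Autoregressive loop: comb -> predict next chunk -> append -> repeat,
    # so each new chunk is conditioned on the updated dataset.
    data = np.asarray(series, dtype=float)
    for _ in range(steps):
        heads = comb(data, n_heads)
        next_chunk = model(heads)  # must return `chunk` new points
        data = np.concatenate([data, next_chunk])
    return data

# Toy stand-in "model" that just repeats the last scanned value.
toy = lambda heads: np.full(8, heads[-1])
out = generate(np.sin(np.linspace(0.0, 6.0, 64)), toy)
print(out.shape)  # (96,)
```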

Input and output data can be tokenized.
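One common way to tokenize a continuous series is uniform binning into integer tokens. This sketch is an assumption about the general idea, not Strong.Zero's actual tokenizer; the bin count and helper names are hypothetical:

```python
import numpy as np

def tokenize(series, n_bins=256):
    # Quantize a continuous series into integer tokens by uniform binning.
    lo, hi = series.min(), series.max()
    tokens = np.floor((series - lo) / (hi - lo) * (n_bins - 1)).astype(int)
    return tokens, (lo, hi)

def detokenize(tokens, bounds, n_bins=256):
    # Map integer tokens back to approximate continuous values.
    lo, hi = bounds
    return lo + tokens / (n_bins - 1) * (hi - lo)

x = np.sin(np.linspace(0.0, 6.28, 100))
toks, bounds = tokenize(x)
x_rec = detokenize(toks, bounds)
print(np.max(np.abs(x - x_rec)) < 0.01)  # True
```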

Fig. 1

Training process

The training schedule consists of periodic learning-rate increases and a gradual noise decrease. A small amount of noise is retained until the end of training. The noise used for data augmentation is uniform pseudo-random noise.
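A schedule with these two properties might look like the sketch below. Every constant here (pump interval, pump factor, decay rate, noise floor) is a hypothetical placeholder, since the actual values are hard-coded in the repository:

```python
def schedule(epoch, base_lr=0.01, pump_every=100, pump_factor=5.0,
             noise0=0.1, noise_floor=0.01, decay=0.995):
    # Periodic learning-rate "pump": briefly multiply the base rate to kick
    # the model out of plateaus, which shows up as a visible loss drop.
    pumped = epoch > 0 and epoch % pump_every == 0
    lr = base_lr * (pump_factor if pumped else 1.0)
    # Decay the augmentation noise geometrically, but keep a small floor
    # of noise until the end of training.
    noise = max(noise_floor, noise0 * decay ** epoch)
    return lr, noise

print(schedule(100))  # pump epoch: elevated learning rate
print(schedule(101))  # back to the base rate
```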

Fig. 2

✅ For now you can train the model on a CPU (test data is included in the code) in a couple of minutes and inspect the example data it generates.

🚫 Weight saving and loading still needs to be repaired.
