self-organizing

Research into self-organizing hierarchies for artificial neural networks. This repository contains the experiments and code for our paper Self-organizing neural network hierarchy.

Key highlights

  • Accepted at AJCAI 2020. [paper]
  • Accepted at NAISys 2020 for poster presentation. [poster]

Installation

To install the dependencies and train, execute the following:

virtualenv -p python3 venv    # create a Python 3 virtual environment
source venv/bin/activate      # activate the environment
chmod +x train.sh             # make the training script executable
./train.sh                    # install dependencies and start training

Note: all arguments and hyperparameters can be modified in config.yaml.
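
The training script is expected to read these settings from config.yaml at startup. A minimal sketch of such a loader is shown below; the file name comes from the note above, but the keys used here (learning_rate, batch_size, meta_steps) are illustrative assumptions, not the repository's actual parameter names.

import yaml  # PyYAML

# Minimal sketch: read hyperparameters from config.yaml.
# The keys below are assumptions for illustration only.
with open("config.yaml") as f:
    config = yaml.safe_load(f)

learning_rate = config.get("learning_rate", 1e-3)
batch_size = config.get("batch_size", 64)
meta_steps = config.get("meta_steps", 100)
print(f"lr={learning_rate}, batch_size={batch_size}, meta_steps={meta_steps}")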

Results

  • Accuracy vs. max entropy (global) [figure: feature map]

  • Accuracy vs. meta-steps [figure: feature map]

TODO

  • Refactor the training of handcrafted architectures
  • Handle both single- and multi-channel inputs (CIFAR-10)
  • Add a code profiler
  • Refactor the meta-learner: simulated annealing (see the sketch below)
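
For reference, a simulated-annealing meta-learner typically follows the standard propose/accept loop sketched below. This is a generic illustration under stated assumptions, not the repository's implementation; propose and evaluate are hypothetical placeholders (e.g. evaluate could return the validation accuracy of a candidate hierarchy).

import math
import random

def simulated_annealing(initial_state, propose, evaluate,
                        t_start=1.0, t_end=0.01, steps=100):
    """Generic simulated annealing loop (illustrative sketch only).

    propose(state)  -> a neighbouring candidate state (placeholder)
    evaluate(state) -> a scalar score to maximise, e.g. validation accuracy
    """
    state, score = initial_state, evaluate(initial_state)
    best_state, best_score = state, score
    for step in range(steps):
        # Geometric cooling schedule from t_start down to t_end.
        t = t_start * (t_end / t_start) ** (step / max(steps - 1, 1))
        candidate = propose(state)
        cand_score = evaluate(candidate)
        # Metropolis criterion: always accept improvements, accept worse
        # candidates with probability exp(delta / temperature).
        delta = cand_score - score
        if delta >= 0 or random.random() < math.exp(delta / t):
            state, score = candidate, cand_score
            if score > best_score:
                best_state, best_score = state, score
    return best_state, best_score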
