This project aims to generate a model zoo of Language Transformers to enable hyper-representation learning

kaaydin/hyper-representation-learning

Model Zoo for Language Transformers

Introduction

Neural networks (NNs) evolve along unique trajectories in weight space during training - depending on hyperparameters and weight initialization - leading to different model parameters (i.e., weights and biases) and different minima on the loss surface. A population of NNs (a model zoo) can form structures in weight space that encode information about the state of training and can reveal latent properties of individual models (e.g., accuracy). With model zoos, we can investigate novel approaches to a variety of use cases, such as model analysis, discovering learning dynamics, and generative modelling of NNs.
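To make the weight-space view concrete, here is a minimal sketch of how a model zoo can be represented as a matrix of flattened weight vectors, where each row is one trained (here, randomly initialized) model's point in weight space. The layer sizes and the flattening step are illustrative assumptions, not this repository's actual pipeline:

```python
import numpy as np

def random_model(rng, sizes=(4, 8, 2)):
    """Return weights and biases for a small MLP with random init.
    (Hypothetical stand-in for a trained zoo member.)"""
    params = []
    for n_in, n_out in zip(sizes[:-1], sizes[1:]):
        params.append(rng.normal(0.0, 0.1, size=(n_in, n_out)))  # weight matrix
        params.append(np.zeros(n_out))                           # bias vector
    return params

def flatten(params):
    """Concatenate all parameters into one weight vector."""
    return np.concatenate([p.ravel() for p in params])

rng = np.random.default_rng(0)
# Stack 10 models into a zoo matrix: one row per model.
zoo = np.stack([flatten(random_model(rng)) for _ in range(10)])
print(zoo.shape)  # -> (10, 58): 10 models, 58 parameters each
```

A hyper-representation learner would then consume rows of `zoo` as inputs, e.g., to predict each model's accuracy from its weights alone.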

For our project, we focus on Language Transformers, in particular BERT.

-- tbd --
