# Exploring and Exploiting Latent Commonsense Knowledge in Pretrained Masked Language Models

Codebase for the paper "Exploring and Exploiting Latent Commonsense Knowledge in Pretrained Masked Language Models".

Note: this codebase is under maintenance and will be completed soon.

Currently supported models (a minimal loading sketch follows the list):

- DistilBERT-base
- BERT (base, large, etc.)
- RoBERTa (base, large, etc.)
- MPNet
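
As a point of reference only (this is not part of the repository's scripts), all of these checkpoints are available on the Hugging Face Hub and can be loaded for masked-token prediction roughly as follows; the Hub identifiers below are the standard public ones and are assumptions on my part, not names taken from this codebase:

```python
# Illustrative only: load one of the supported masked language models
# from the Hugging Face Hub (assumes the `transformers` package is available).
from transformers import AutoModelForMaskedLM, AutoTokenizer

model_name = "roberta-base"  # or "distilbert-base-uncased", "bert-base-uncased", "microsoft/mpnet-base", ...
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForMaskedLM.from_pretrained(model_name)
model.eval()
```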

## Prepare the codebase

```bash
git clone https://github.com/DRSY/LAMP.git && cd LAMP
pip install -r requirements.txt
```
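
A quick sanity check that the environment is usable; this assumes the requirements include PyTorch and Transformers, which this README does not state explicitly:

```python
# Hypothetical sanity check: confirm the core dependencies import correctly.
import torch
import transformers

print("torch:", torch.__version__, "| transformers:", transformers.__version__)
print("CUDA available:", torch.cuda.is_available())
```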

## Run pruning and probing

Specify the parameters for the probing experiments in a separate params file, then run:

```bash
make -f Makefile probe
```

Detailed hyperparameters can be found in `probe.sh`.
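
The actual probing configuration lives in the params file and `probe.sh`; as a conceptual sketch only (not the procedure implemented in this repository), cloze-style probing of a masked LM for commonsense knowledge amounts to filling a blank and reading off the model's top predictions, here with a hypothetical query:

```python
# Conceptual sketch of cloze-style probing (illustrative only; the
# repository's probing logic is configured via probe.sh and the params file).
from transformers import pipeline

fill = pipeline("fill-mask", model="bert-base-uncased")
query = "A bird can [MASK]."  # [MASK] is BERT's mask token
for pred in fill(query, top_k=5):
    print(f"{pred['token_str']:>12}  {pred['score']:.3f}")
```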

## Run GLUE

Specify the parameters for the GLUE experiments in a separate params file, then run:

```bash
make -f Makefile glue
```

Detailed hyperparameters can be found in `glue.sh`.
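
For reference, the GLUE tasks targeted by this step can be inspected directly with the Hugging Face `datasets` library; this snippet only illustrates the benchmark data and is not part of the repository's training pipeline:

```python
# Illustrative only: peek at one GLUE task (SST-2) with the `datasets` library.
from datasets import load_dataset

sst2 = load_dataset("glue", "sst2")
print(sst2)              # DatasetDict with train/validation/test splits
print(sst2["train"][0])  # e.g. {'sentence': ..., 'label': ..., 'idx': ...}
```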

## Clean the log files

```bash
make -f Makefile clean
```